Apr 23 08:14:10.931498 ip-10-0-135-129 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 08:14:11.414845 ip-10-0-135-129 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:14:11.414845 ip-10-0-135-129 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 08:14:11.414845 ip-10-0-135-129 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:14:11.414845 ip-10-0-135-129 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 08:14:11.414845 ip-10-0-135-129 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
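The deprecation warnings above all point at the same remedy: move the flag values into the file passed via --config. A minimal sketch of the config-file equivalents (kubelet.config.k8s.io/v1beta1); the containerRuntimeEndpoint value matches the flag dump later in this log, while the volumePluginDir path and systemReserved sizes are placeholders, since this node's real values are not shown here:

```yaml
# Sketch: config-file equivalents of the deprecated kubelet flags above.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: /var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir (placeholder path)
systemReserved:                                     # replaces --system-reserved (placeholder sizes)
  cpu: 500m
  memory: 1Gi
```

There is no config field for --minimum-container-ttl-duration; per the warning itself, evictionHard or evictionSoft settings take over that role.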
Apr 23 08:14:11.414845 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.367697 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 08:14:11.414845 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.370859 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:14:11.414845 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371014 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:14:11.414845 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371019 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:14:11.414845 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371022 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:14:11.414845 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371025 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:14:11.414845 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371029 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:14:11.414845 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371032 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:14:11.414845 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371035 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371037 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371040 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371043 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371048 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371053 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371057 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371060 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371063 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371067 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371069 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371073 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371075 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371078 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371081 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371083 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371086 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371089 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371093 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:14:11.415863 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371095 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371098 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371104 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371115 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371118 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371120 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371123 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371126 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371129 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371132 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371134 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371137 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371142 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371145 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371148 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371152 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371155 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371158 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371161 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:14:11.416493 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371164 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371166 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371170 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371173 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371175 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371178 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371180 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371183 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371186 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371189 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371206 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371209 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371212 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371216 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371218 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371221 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371224 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371226 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371229 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:14:11.417154 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371233 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371236 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371238 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371241 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371244 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371247 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371250 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371253 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371255 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371258 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371262 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371265 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371268 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371271 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371273 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371276 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371279 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371282 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371284 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371288 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:14:11.417732 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371291 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371294 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371719 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371725 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371728 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371731 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371734 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371737 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371740 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371743 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371746 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371749 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371751 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371754 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371757 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371759 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371762 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371764 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371767 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371770 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:14:11.418525 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371772 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371775 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371778 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371780 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371782 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371785 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371788 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371791 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371794 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371797 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371800 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371802 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371805 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371808 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371811 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371813 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371816 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371818 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371821 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371823 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:14:11.419629 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371826 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371829 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371831 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371834 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371837 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371839 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371841 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371844 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371846 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371849 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371851 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371854 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371856 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371859 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371861 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371863 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371866 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371868 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371871 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371875 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:14:11.422473 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371877 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371880 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371883 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371886 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371889 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371891 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371894 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371896 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371899 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371901 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371904 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371906 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371909 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371911 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371914 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371916 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371919 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371921 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371929 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:14:11.423075 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371933 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371935 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371940 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371942 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371945 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371947 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371950 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371955 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
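The "unrecognized feature gate" warnings above repeat the same gate list twice (the kubelet evaluates the gate set more than once at startup, with distinct klog timestamps). A minimal sketch, assuming plain-text journal lines as input, for reducing the noise to a unique list of gate names; the helper name and sample lines are for illustration only:

```python
import re

# Matches the gate name at the end of a kubelet "unrecognized feature gate" line.
GATE_RE = re.compile(r"unrecognized feature gate: (\S+)")

def unknown_gates(log_lines):
    """Return the sorted, deduplicated gate names flagged as unrecognized."""
    return sorted({m.group(1) for line in log_lines for m in GATE_RE.finditer(line)})

# Two sample lines taken from this log; the same gate appears in both passes,
# so the set comprehension collapses them into one entry.
sample = [
    'W0423 08:14:11.370859 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS',
    'W0423 08:14:11.371788 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS',
]
print(unknown_gates(sample))  # ['ManagedBootImagesAWS']
```

In practice the input would come from something like `journalctl -u kubelet` piped into this script, one journal line per element.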
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.371958 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372027 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372034 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372042 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372046 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372052 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372055 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372060 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372064 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372067 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372071 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372074 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372078 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372081 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 08:14:11.423822 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372083 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372086 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372090 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372093 2570 flags.go:64] FLAG: --cloud-config=""
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372095 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372098 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372102 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372105 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372108 2570 flags.go:64] FLAG: --config-dir=""
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372112 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372116 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372120 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372124 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372127 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372130 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372133 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372136 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372139 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372142 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372145 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372154 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372157 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372160 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372163 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372167 2570 flags.go:64] FLAG: --enable-server="true"
Apr 23 08:14:11.424552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372170 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372175 2570 flags.go:64] FLAG: --event-burst="100"
Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372178 2570 flags.go:64] FLAG: --event-qps="50"
Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372181 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372184 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372187 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372207 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372211 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372214 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372217 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372220 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372223 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372225 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372228 2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372231 2570 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372234 2570 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372238 2570 flags.go:64] FLAG: --feature-gates=""
Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372243 2570 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 08:14:11.634693 ip-10-0-135-129
kubenswrapper[2570]: I0423 08:14:11.372246 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372250 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372253 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372256 2570 flags.go:64] FLAG: --healthz-port="10248" Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372259 2570 flags.go:64] FLAG: --help="false" Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372262 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-135-129.ec2.internal" Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372265 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 08:14:11.634693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372268 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 08:14:11.476216 ip-10-0-135-129 systemd[1]: Started Kubernetes Kubelet. 
Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372271 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372275 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372278 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372281 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372284 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372286 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372291 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372294 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372297 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372300 2570 flags.go:64] FLAG: --kube-reserved="" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372303 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372305 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372309 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372312 2570 
flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372314 2570 flags.go:64] FLAG: --lock-file="" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372317 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372320 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372323 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372328 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372331 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372334 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372337 2570 flags.go:64] FLAG: --logging-format="text" Apr 23 08:14:11.657598 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372341 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372344 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372347 2570 flags.go:64] FLAG: --manifest-url="" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372351 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372355 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372358 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 08:14:11.658690 
ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372362 2570 flags.go:64] FLAG: --max-pods="110" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372365 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372368 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372371 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372374 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372377 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372380 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372383 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372390 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372393 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372396 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372400 2570 flags.go:64] FLAG: --pod-cidr="" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372403 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372408 2570 
flags.go:64] FLAG: --pod-manifest-path="" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372411 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372414 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372417 2570 flags.go:64] FLAG: --port="10250" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372420 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 08:14:11.658690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372423 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b427365cd9d583de" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372426 2570 flags.go:64] FLAG: --qos-reserved="" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372429 2570 flags.go:64] FLAG: --read-only-port="10255" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372432 2570 flags.go:64] FLAG: --register-node="true" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372435 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372438 2570 flags.go:64] FLAG: --register-with-taints="" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372441 2570 flags.go:64] FLAG: --registry-burst="10" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372444 2570 flags.go:64] FLAG: --registry-qps="5" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372447 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372451 2570 flags.go:64] FLAG: --reserved-memory="" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372455 2570 flags.go:64] FLAG: 
--resolv-conf="/etc/resolv.conf" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372460 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372463 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372466 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372469 2570 flags.go:64] FLAG: --runonce="false" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372472 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372475 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372478 2570 flags.go:64] FLAG: --seccomp-default="false" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372481 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372484 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372487 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372490 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372493 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372496 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372499 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 08:14:11.659550 
ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372502 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 08:14:11.659550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372505 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372508 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372511 2570 flags.go:64] FLAG: --system-cgroups="" Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372514 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372519 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372522 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372525 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372529 2570 flags.go:64] FLAG: --tls-min-version="" Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372532 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372535 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372537 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372540 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372543 2570 flags.go:64] FLAG: --v="2" Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372547 2570 flags.go:64] FLAG: --version="false" 
Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372551 2570 flags.go:64] FLAG: --vmodule="" Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372557 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.372560 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372657 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372661 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372664 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372667 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372670 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372673 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372676 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 08:14:11.660595 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372679 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372682 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372684 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 
08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372687 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372690 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372693 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372695 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372698 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372701 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372704 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372707 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372710 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372712 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372715 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372719 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372722 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372725 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372728 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372731 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:14:11.661368 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372734 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372736 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372739 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372741 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372745 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372748 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372751 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372755 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372759 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372762 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372764 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372767 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372770 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372772 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372775 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372777 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372780 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372782 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372785 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:14:11.662067 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372787 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372790 2570 
feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372793 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372796 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372799 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372801 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372804 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372807 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372809 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372812 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372815 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372817 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372820 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372822 2570 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesAWS Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372825 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372827 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372830 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372834 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372836 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 08:14:11.663040 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372843 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372846 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372849 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372851 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372854 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372856 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372859 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 
08:14:11.372861 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372864 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372866 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372869 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372872 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372874 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372877 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372879 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372881 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372884 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372887 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372890 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372893 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 
08:14:11.663843 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372895 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 08:14:11.664594 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.372898 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 08:14:11.664594 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.373639 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 08:14:11.664594 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.380532 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 08:14:11.664594 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.380554 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 08:14:11.664594 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380618 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 08:14:11.664594 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380626 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 08:14:11.664594 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380631 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 08:14:11.664594 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380636 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:14:11.664594 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380641 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 
08:14:11.664594 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380645 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:14:11.664594 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380650 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:14:11.664594 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380654 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:14:11.664594 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380658 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:14:11.664594 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380662 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:14:11.664594 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380666 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:14:11.664594 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380670 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:14:11.666222 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380673 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:14:11.666222 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380677 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 08:14:11.666222 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380683 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 08:14:11.666222 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380689 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:14:11.666222 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380694 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 08:14:11.666222 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380698 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 08:14:11.666222 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380702 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 08:14:11.666222 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380706 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 08:14:11.666222 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380710 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:14:11.666222 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380715 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:14:11.666222 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380719 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 08:14:11.666222 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380723 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 23 08:14:11.666222 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380727 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 08:14:11.666222 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380731 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:14:11.666222 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380735 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:14:11.666222 
ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380739 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:14:11.666222 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380743 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:14:11.666222 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380747 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:14:11.666222 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380751 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380755 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380759 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380763 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380767 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380771 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380774 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380778 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380781 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380785 2570 feature_gate.go:328] unrecognized feature gate: 
MachineAPIMigration Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380789 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380793 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380799 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380803 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380807 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380810 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380814 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380818 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380821 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380824 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:14:11.667017 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380827 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:14:11.953054 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380831 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 08:14:11.953054 ip-10-0-135-129 
kubenswrapper[2570]: W0423 08:14:11.380834 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:14:11.953054 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380843 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 08:14:11.953054 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380848 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 08:14:11.953054 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380851 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:14:11.953054 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380855 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 08:14:11.953054 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380859 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 08:14:11.953054 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380863 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 08:14:11.953054 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380867 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:14:11.953054 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380871 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 08:14:11.953054 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380876 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 08:14:11.953054 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380879 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 08:14:11.953054 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380883 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 08:14:11.953054 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380888 2570 feature_gate.go:328] 
unrecognized feature gate: NetworkLiveMigration Apr 23 08:14:11.953054 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380892 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 08:14:11.953054 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380895 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:14:11.953054 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380900 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:14:11.953054 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380903 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 08:14:11.953054 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380908 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:14:11.953741 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380911 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:14:11.953741 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380915 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 08:14:11.953741 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380919 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 08:14:11.953741 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380923 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 08:14:11.953741 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380927 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 08:14:11.953741 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380930 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:14:11.953741 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380934 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 08:14:11.953741 ip-10-0-135-129 
kubenswrapper[2570]: W0423 08:14:11.380938 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 08:14:11.953741 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380942 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 08:14:11.953741 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380946 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 08:14:11.953741 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380950 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 08:14:11.953741 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380954 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:14:11.953741 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380958 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:14:11.953741 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380962 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 08:14:11.953741 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380965 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:14:11.953741 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.380970 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:14:11.954482 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.380978 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 08:14:11.954482 ip-10-0-135-129 kubenswrapper[2570]: 
W0423 08:14:11.381109 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 08:14:11.954482 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381117 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 08:14:11.954482 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381122 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 08:14:11.954482 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381127 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:14:11.954482 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381132 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 08:14:11.954482 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381136 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 08:14:11.954482 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381141 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 08:14:11.954482 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381146 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 08:14:11.954482 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381151 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 08:14:11.954482 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381154 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 23 08:14:11.954482 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381159 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:14:11.954482 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381164 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:14:11.954482 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381168 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:14:11.954482 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381172 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381176 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381180 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381184 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381187 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381210 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381214 2570 feature_gate.go:328] unrecognized 
feature gate: BuildCSIVolumes Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381218 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381222 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381226 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381230 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381234 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381238 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381242 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381246 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381249 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381251 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381255 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381258 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 
08:14:11.381260 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 08:14:11.954893 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381263 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381266 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381268 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381273 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381275 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381278 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381281 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381283 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381286 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381289 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381292 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381295 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 
08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381298 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381300 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381303 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381305 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381308 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381311 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381313 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381316 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 08:14:12.119832 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381319 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381321 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381324 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381326 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381328 2570 
feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381331 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381334 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381336 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381339 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381343 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381345 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381348 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381350 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381353 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381355 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381359 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381362 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 
08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381365 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381367 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381370 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:14:12.121066 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381372 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:14:12.121666 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381375 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:14:12.121666 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381377 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 08:14:12.121666 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381380 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 08:14:12.121666 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381383 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 08:14:12.121666 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381385 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 08:14:12.121666 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381388 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:14:12.121666 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381391 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:14:12.121666 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381393 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 08:14:12.121666 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381396 2570 feature_gate.go:328] unrecognized feature gate: 
VolumeGroupSnapshot Apr 23 08:14:12.121666 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381399 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:14:12.121666 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381402 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:14:12.121666 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:11.381404 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 08:14:12.121666 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.381409 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 08:14:12.121666 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.382105 2570 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 08:14:12.121666 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.385459 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 08:14:12.122241 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.386413 2570 server.go:1019] "Starting client certificate rotation" Apr 23 08:14:12.122241 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.386510 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 08:14:12.122241 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.386556 2570 certificate_manager.go:566] "Rotating certificates" 
logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:14:12.122241 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.410573 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:14:12.122241 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.416047 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:14:12.122241 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.432777 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 23 08:14:12.122241 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.438848 2570 log.go:25] "Validated CRI v1 image API"
Apr 23 08:14:12.122241 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.440774 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 08:14:12.122241 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.444601 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 08:14:12.122241 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.444901 2570 fs.go:135] Filesystem UUIDs: map[3d3f27fd-f949-491a-8e50-0da9c2d31784:/dev/nvme0n1p4 43b7c274-37ed-4002-b732-46a9eca00f72:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 23 08:14:12.122241 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.444918 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 08:14:12.122763 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.450803 2570 manager.go:217] Machine: {Timestamp:2026-04-23 08:14:11.448688679 +0000 UTC m=+0.396875899 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101656 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a003f01e55959a80f3c2ee8e6a58a SystemUUID:ec2a003f-01e5-5959-a80f-3c2ee8e6a58a BootID:0bf5afdc-a610-40f8-8453-d23b61a5b15f Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:98:1c:14:0b:cf Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:98:1c:14:0b:cf Speed:0 Mtu:9001} {Name:ovs-system MacAddress:22:01:17:5b:e7:71 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 08:14:12.122763 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.450903 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 08:14:12.122763 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.450982 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 08:14:12.122763 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.453741 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 08:14:12.122763 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.453768 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-129.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 08:14:12.122763 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.453918 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 08:14:12.122763 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.453927 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 08:14:12.122763 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.453940 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 08:14:12.122763 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.454672 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 08:14:12.122763 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.456148 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 08:14:12.122763 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.456446 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 08:14:12.122763 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.458801 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 08:14:12.122763 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.458818 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 08:14:12.122763 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.458831 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 08:14:12.122763 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.458841 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 23 08:14:12.122763 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.458850 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 08:14:12.122763 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.460021 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 08:14:12.122763 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.460038 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.463098 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.464458 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.466291 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.466307 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.466313 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.466319 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.466325 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.466332 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.466341 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.466349 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.466359 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.466365 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.466381 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.466391 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.470238 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.470261 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.474205 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.474247 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-129.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.475063 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.475302 2570 server.go:1295] "Started kubelet"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.475384 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.475439 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.475497 2570 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.476608 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.477930 2570 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.481417 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.481431 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.482010 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.482017 2570 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.482033 2570 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.482095 2570 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.482101 2570 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.482219 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-129.ec2.internal\" not found"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.485647 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.485659 2570 factory.go:55] Registering systemd factory
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.485665 2570 factory.go:223] Registration of the systemd container factory successfully
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.486278 2570 factory.go:153] Registering CRI-O factory
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.486296 2570 factory.go:223] Registration of the crio container factory successfully
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.486319 2570 factory.go:103] Registering Raw factory
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.486331 2570 manager.go:1196] Started watching for new ooms in manager
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.486686 2570 manager.go:319] Starting recovery of all containers
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.487019 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-129.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.487065 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-129.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.487253 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 08:14:12.123346 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.488343 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.487342 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-129.ec2.internal.18a8ee4946308cf8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-129.ec2.internal,UID:ip-10-0-135-129.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-129.ec2.internal,},FirstTimestamp:2026-04-23 08:14:11.475270904 +0000 UTC m=+0.423458125,LastTimestamp:2026-04-23 08:14:11.475270904 +0000 UTC m=+0.423458125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-129.ec2.internal,}"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.496986 2570 manager.go:324] Recovery completed
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.498882 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vj4qn"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.502865 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.505521 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.505548 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.505561 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.506068 2570 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.506079 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.506096 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.506620 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vj4qn"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.507699 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-129.ec2.internal.18a8ee4947fe51b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-129.ec2.internal,UID:ip-10-0-135-129.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-135-129.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-135-129.ec2.internal,},FirstTimestamp:2026-04-23 08:14:11.505533362 +0000 UTC m=+0.453720582,LastTimestamp:2026-04-23 08:14:11.505533362 +0000 UTC m=+0.453720582,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-129.ec2.internal,}"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.508555 2570 policy_none.go:49] "None policy: Start"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.508567 2570 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.508577 2570 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.548765 2570 manager.go:341] "Starting Device Plugin manager"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.548796 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.548808 2570 server.go:85] "Starting device plugin registration server"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.549093 2570 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.549107 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.549213 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.549303 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.549311 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.549893 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.549922 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-129.ec2.internal\" not found"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.582666 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.584187 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.584786 2570 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.585184 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.585207 2570 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.585243 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.588346 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.649584 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.653039 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.653067 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:14:12.124547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.653077 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.653101 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-129.ec2.internal"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.663344 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-129.ec2.internal"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.663372 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-129.ec2.internal\": node \"ip-10-0-135-129.ec2.internal\" not found"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.675693 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-129.ec2.internal\" not found"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.685993 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-129.ec2.internal"]
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.686053 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.686965 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.686993 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.687006 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.688222 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.688356 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.688386 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.689010 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.689015 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.689034 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.689041 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.689050 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.689056 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.689974 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-129.ec2.internal"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.689994 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.690650 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.690682 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.690722 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.718862 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-129.ec2.internal\" not found" node="ip-10-0-135-129.ec2.internal"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.723413 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-129.ec2.internal\" not found" node="ip-10-0-135-129.ec2.internal"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.776723 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-129.ec2.internal\" not found"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.783074 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/65646df45a52d0346b7d7e22ae4cefe7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal\" (UID: \"65646df45a52d0346b7d7e22ae4cefe7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.783103 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/65646df45a52d0346b7d7e22ae4cefe7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal\" (UID: \"65646df45a52d0346b7d7e22ae4cefe7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.783127 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a24ff0e0a63ca37f66f4f5e5712330cc-config\") pod \"kube-apiserver-proxy-ip-10-0-135-129.ec2.internal\" (UID: \"a24ff0e0a63ca37f66f4f5e5712330cc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-129.ec2.internal"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.877466 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-129.ec2.internal\" not found"
Apr 23 08:14:12.125626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.883718 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/65646df45a52d0346b7d7e22ae4cefe7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal\" (UID: \"65646df45a52d0346b7d7e22ae4cefe7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal"
Apr 23 08:14:12.126460 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.883745 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a24ff0e0a63ca37f66f4f5e5712330cc-config\") pod \"kube-apiserver-proxy-ip-10-0-135-129.ec2.internal\" (UID: \"a24ff0e0a63ca37f66f4f5e5712330cc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-129.ec2.internal"
Apr 23 08:14:12.126460 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.883762 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/65646df45a52d0346b7d7e22ae4cefe7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal\" (UID: \"65646df45a52d0346b7d7e22ae4cefe7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal"
Apr 23 08:14:12.126460 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.883808 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a24ff0e0a63ca37f66f4f5e5712330cc-config\") pod \"kube-apiserver-proxy-ip-10-0-135-129.ec2.internal\" (UID: \"a24ff0e0a63ca37f66f4f5e5712330cc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-129.ec2.internal"
Apr 23 08:14:12.126460 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.883820 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/65646df45a52d0346b7d7e22ae4cefe7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal\" (UID: \"65646df45a52d0346b7d7e22ae4cefe7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal"
Apr 23 08:14:12.126460 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:11.883848 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/65646df45a52d0346b7d7e22ae4cefe7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal\" (UID: \"65646df45a52d0346b7d7e22ae4cefe7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal"
Apr 23 08:14:12.126460 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:11.978153 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-129.ec2.internal\" not found"
Apr 23 08:14:12.126460 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:12.020631 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal"
Apr 23 08:14:12.126460 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:12.026139 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-129.ec2.internal"
Apr 23 08:14:12.126460 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:12.078671 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-129.ec2.internal\" not found"
Apr 23 08:14:12.179304 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:12.179261 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-129.ec2.internal\" not found"
Apr 23 08:14:12.279835 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:12.279747 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-129.ec2.internal\" not found"
Apr 23 08:14:12.380364 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:12.380328 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-129.ec2.internal\" not found"
Apr 23 08:14:12.386655 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:12.386633 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 08:14:12.386788 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:12.386772 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 08:14:12.481278 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:12.481239 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-129.ec2.internal\" not found"
Apr 23 08:14:12.482354 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:12.482333 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 08:14:12.499814 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:12.499788 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 08:14:12.508843 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:12.508806 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 08:09:11 +0000 UTC" deadline="2027-11-09 00:37:44.445013579 +0000 UTC"
Apr 23 08:14:12.508843 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:12.508842 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13552h23m31.936175218s"
Apr 23 08:14:12.519619 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:12.519594 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:14:12.520403 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:12.520380 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-925gz"
Apr 23 08:14:12.529626 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:12.529605 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-925gz"
Apr 23 08:14:12.581686 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:12.581662 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-129.ec2.internal\" not found"
Apr 23 08:14:12.586285 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:12.586256 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65646df45a52d0346b7d7e22ae4cefe7.slice/crio-785670fedad49ace88b4475b1928f0f74c13ce871f2a6c21acb745d074952739 WatchSource:0}: Error finding container 785670fedad49ace88b4475b1928f0f74c13ce871f2a6c21acb745d074952739: Status 404 returned error can't find the container with id 785670fedad49ace88b4475b1928f0f74c13ce871f2a6c21acb745d074952739
Apr 23 08:14:12.586507 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:12.586468 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda24ff0e0a63ca37f66f4f5e5712330cc.slice/crio-73a8a6fca9b60dbb4b738f92d5a3f92740ca55d46c55cb3383ab057862402ff8 WatchSource:0}: Error finding container 73a8a6fca9b60dbb4b738f92d5a3f92740ca55d46c55cb3383ab057862402ff8: Status 404 returned error can't find the container with id 73a8a6fca9b60dbb4b738f92d5a3f92740ca55d46c55cb3383ab057862402ff8
Apr 23 08:14:12.591552 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:12.591537 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 08:14:12.598008 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:12.597991 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:14:12.648857 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:12.648828 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:14:12.682564 ip-10-0-135-129
kubenswrapper[2570]: I0423 08:14:12.682517 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal" Apr 23 08:14:12.697989 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:12.697962 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 08:14:12.698904 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:12.698891 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-129.ec2.internal" Apr 23 08:14:12.704902 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:12.704889 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 08:14:13.394700 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.394612 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:14:13.460560 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.460505 2570 apiserver.go:52] "Watching apiserver" Apr 23 08:14:13.465575 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.465545 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 08:14:13.467809 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.467783 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal","openshift-multus/multus-additional-cni-plugins-rctnw","openshift-multus/network-metrics-daemon-4gzjb","openshift-image-registry/node-ca-fgmqg","openshift-multus/multus-hrjxz","openshift-network-diagnostics/network-check-target-wrt6p","openshift-network-operator/iptables-alerter-7wvqt","openshift-ovn-kubernetes/ovnkube-node-vsvjc","kube-system/konnectivity-agent-z2ckf","kube-system/kube-apiserver-proxy-ip-10-0-135-129.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s","openshift-cluster-node-tuning-operator/tuned-cxzv6","openshift-dns/node-resolver-mslvd"] Apr 23 08:14:13.469711 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.469693 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7wvqt" Apr 23 08:14:13.470735 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.470717 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rctnw" Apr 23 08:14:13.471784 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.471763 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:13.471874 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:13.471839 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4gzjb" podUID="5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f" Apr 23 08:14:13.472334 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.472315 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:14:13.472428 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.472315 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 08:14:13.472428 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.472412 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-zn7rr\"" Apr 23 08:14:13.472534 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.472315 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 08:14:13.472948 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.472917 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 08:14:13.473045 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.473028 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 08:14:13.473106 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.473043 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-swx5j\"" Apr 23 08:14:13.473189 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.473173 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 08:14:13.473352 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.473303 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 08:14:13.473451 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.473370 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 08:14:13.473866 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.473837 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fgmqg" Apr 23 08:14:13.475422 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.475044 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.475422 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.475140 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:13.475422 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.475309 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-gk9j6\"" Apr 23 08:14:13.475422 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:13.475332 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wrt6p" podUID="93ac6a8e-10aa-4687-be43-6d712bee9ebd" Apr 23 08:14:13.475671 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.475424 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 08:14:13.475671 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.475440 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 08:14:13.475671 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.475493 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 08:14:13.476572 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.476552 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.476671 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.476597 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-czlf9\"" Apr 23 08:14:13.477022 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.476928 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 08:14:13.478382 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.478360 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 08:14:13.478382 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.478381 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 08:14:13.478530 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.478488 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 08:14:13.478530 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.478502 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 08:14:13.478530 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.478527 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 08:14:13.478674 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.478664 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 08:14:13.478802 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.478783 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-t9gsk\"" Apr 23 08:14:13.479439 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.479369 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-z2ckf" Apr 23 08:14:13.479542 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.479471 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" Apr 23 08:14:13.481120 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.481081 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 08:14:13.481360 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.481342 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-6rgd8\"" Apr 23 08:14:13.481723 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.481348 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 08:14:13.481723 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.481470 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 08:14:13.481723 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.481701 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 08:14:13.481723 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.481710 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-m8kzb\"" Apr 23 08:14:13.482388 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.482240 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 08:14:13.483217 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.482935 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.485755 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.484653 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 08:14:13.485755 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.485132 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-z2mkl\"" Apr 23 08:14:13.485755 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.485392 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:14:13.486394 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.486297 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mslvd" Apr 23 08:14:13.489371 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.489352 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 08:14:13.489805 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.489785 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-k6dm6\"" Apr 23 08:14:13.490334 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.490319 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 08:14:13.490828 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.490807 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-sysctl-d\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " 
pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.490930 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.490840 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-host\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.490930 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.490862 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbfh5\" (UniqueName: \"kubernetes.io/projected/f033049a-1b87-451a-a1fc-53b7ebf036df-kube-api-access-sbfh5\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.490930 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.490905 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-cni-netd\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.491081 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.490960 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-ovn-node-metrics-cert\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.491081 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.490990 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-ovnkube-script-lib\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.491081 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491014 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-sysctl-conf\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.491081 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491040 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5da47bef-96ab-409c-80f2-3c5cb2356454-socket-dir\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" Apr 23 08:14:13.491308 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491082 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5da47bef-96ab-409c-80f2-3c5cb2356454-sys-fs\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" Apr 23 08:14:13.491308 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491125 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/575d4e03-f407-47ca-9efc-8c7bee335d30-cni-binary-copy\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw" Apr 23 08:14:13.491308 ip-10-0-135-129 
kubenswrapper[2570]: I0423 08:14:13.491152 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-etc-kubernetes\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.491308 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491171 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5da47bef-96ab-409c-80f2-3c5cb2356454-device-dir\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" Apr 23 08:14:13.491308 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491214 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-systemd\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.491308 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491231 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg4tp\" (UniqueName: \"kubernetes.io/projected/575d4e03-f407-47ca-9efc-8c7bee335d30-kube-api-access-lg4tp\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw" Apr 23 08:14:13.491308 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491246 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-kubelet\") pod \"ovnkube-node-vsvjc\" 
(UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.491308 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491265 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-systemd-units\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.491308 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491279 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-kubernetes\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.491308 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491302 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fca14dff-5b1c-41d2-adc2-0d42ef722a54-tmp\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.491753 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491328 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/575d4e03-f407-47ca-9efc-8c7bee335d30-system-cni-dir\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw" Apr 23 08:14:13.491753 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491362 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/575d4e03-f407-47ca-9efc-8c7bee335d30-os-release\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw" Apr 23 08:14:13.491753 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491390 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/749e4605-b37e-4004-a2dc-68092884ddae-serviceca\") pod \"node-ca-fgmqg\" (UID: \"749e4605-b37e-4004-a2dc-68092884ddae\") " pod="openshift-image-registry/node-ca-fgmqg" Apr 23 08:14:13.491753 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491414 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/75c70767-827d-46b1-acb4-c76aff02f4bd-konnectivity-ca\") pod \"konnectivity-agent-z2ckf\" (UID: \"75c70767-827d-46b1-acb4-c76aff02f4bd\") " pod="kube-system/konnectivity-agent-z2ckf" Apr 23 08:14:13.491753 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491436 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5da47bef-96ab-409c-80f2-3c5cb2356454-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" Apr 23 08:14:13.491753 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491460 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mfl4\" (UniqueName: \"kubernetes.io/projected/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-kube-api-access-8mfl4\") pod \"network-metrics-daemon-4gzjb\" (UID: \"5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f\") " pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:13.491753 ip-10-0-135-129 
kubenswrapper[2570]: I0423 08:14:13.491482 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-sys\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.491753 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491508 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29srk\" (UniqueName: \"kubernetes.io/projected/fca14dff-5b1c-41d2-adc2-0d42ef722a54-kube-api-access-29srk\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.491753 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491534 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/887c87c3-07ae-4b74-aa64-fe19546746e0-iptables-alerter-script\") pod \"iptables-alerter-7wvqt\" (UID: \"887c87c3-07ae-4b74-aa64-fe19546746e0\") " pod="openshift-network-operator/iptables-alerter-7wvqt" Apr 23 08:14:13.491753 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491559 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/575d4e03-f407-47ca-9efc-8c7bee335d30-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw" Apr 23 08:14:13.491753 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491583 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/f033049a-1b87-451a-a1fc-53b7ebf036df-cni-binary-copy\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.491753 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491620 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-modprobe-d\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.491753 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491648 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dswlg\" (UniqueName: \"kubernetes.io/projected/887c87c3-07ae-4b74-aa64-fe19546746e0-kube-api-access-dswlg\") pod \"iptables-alerter-7wvqt\" (UID: \"887c87c3-07ae-4b74-aa64-fe19546746e0\") " pod="openshift-network-operator/iptables-alerter-7wvqt" Apr 23 08:14:13.491753 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491673 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/575d4e03-f407-47ca-9efc-8c7bee335d30-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw" Apr 23 08:14:13.491753 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491697 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-multus-cni-dir\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.491753 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491756 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-host-run-multus-certs\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.492508 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491792 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-var-lib-openvswitch\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.492508 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491816 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-run\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.492508 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491839 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/887c87c3-07ae-4b74-aa64-fe19546746e0-host-slash\") pod \"iptables-alerter-7wvqt\" (UID: \"887c87c3-07ae-4b74-aa64-fe19546746e0\") " pod="openshift-network-operator/iptables-alerter-7wvqt" Apr 23 08:14:13.492508 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491864 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/575d4e03-f407-47ca-9efc-8c7bee335d30-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " 
pod="openshift-multus/multus-additional-cni-plugins-rctnw" Apr 23 08:14:13.492508 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491888 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/749e4605-b37e-4004-a2dc-68092884ddae-host\") pod \"node-ca-fgmqg\" (UID: \"749e4605-b37e-4004-a2dc-68092884ddae\") " pod="openshift-image-registry/node-ca-fgmqg" Apr 23 08:14:13.492508 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491906 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-hostroot\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.492508 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491946 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-multus-conf-dir\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.492508 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.491980 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-run-netns\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.492508 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492015 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-etc-openvswitch\") pod \"ovnkube-node-vsvjc\" (UID: 
\"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.492508 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492044 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-system-cni-dir\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.492508 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492059 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-host-run-k8s-cni-cncf-io\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.492508 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492074 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-host-var-lib-cni-multus\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.492508 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492089 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-host-var-lib-kubelet\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.492508 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492103 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-log-socket\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.492508 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492117 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-env-overrides\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.492508 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492145 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-lib-modules\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.492508 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492171 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-node-log\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.493348 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492209 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-ovnkube-config\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.493348 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492228 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsgfg\" (UniqueName: \"kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg\") pod \"network-check-target-wrt6p\" (UID: \"93ac6a8e-10aa-4687-be43-6d712bee9ebd\") " pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:13.493348 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492252 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-sysconfig\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.493348 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492276 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs\") pod \"network-metrics-daemon-4gzjb\" (UID: \"5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f\") " pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:13.493348 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492298 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5da47bef-96ab-409c-80f2-3c5cb2356454-registration-dir\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" Apr 23 08:14:13.493348 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492319 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5da47bef-96ab-409c-80f2-3c5cb2356454-etc-selinux\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: 
\"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" Apr 23 08:14:13.493348 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492341 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-cnibin\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.493348 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492361 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-slash\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.493348 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492397 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-run-systemd\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.493348 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492428 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/575d4e03-f407-47ca-9efc-8c7bee335d30-cnibin\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw" Apr 23 08:14:13.493348 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492452 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csb9l\" (UniqueName: 
\"kubernetes.io/projected/749e4605-b37e-4004-a2dc-68092884ddae-kube-api-access-csb9l\") pod \"node-ca-fgmqg\" (UID: \"749e4605-b37e-4004-a2dc-68092884ddae\") " pod="openshift-image-registry/node-ca-fgmqg" Apr 23 08:14:13.493348 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492470 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-run-ovn\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.493348 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492483 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-cni-bin\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.493348 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492498 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-multus-socket-dir-parent\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.493348 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492512 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.493348 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492533 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-host-var-lib-cni-bin\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.494056 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492565 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-tuned\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.494056 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492589 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/75c70767-827d-46b1-acb4-c76aff02f4bd-agent-certs\") pod \"konnectivity-agent-z2ckf\" (UID: \"75c70767-827d-46b1-acb4-c76aff02f4bd\") " pod="kube-system/konnectivity-agent-z2ckf" Apr 23 08:14:13.494056 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492606 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-run-openvswitch\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.494056 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492625 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-os-release\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.494056 
ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492642 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-host-run-netns\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.494056 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492666 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f033049a-1b87-451a-a1fc-53b7ebf036df-multus-daemon-config\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.494056 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492695 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvqth\" (UniqueName: \"kubernetes.io/projected/5da47bef-96ab-409c-80f2-3c5cb2356454-kube-api-access-tvqth\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" Apr 23 08:14:13.494056 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492717 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-var-lib-kubelet\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.494056 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492740 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.494056 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.492761 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg6zz\" (UniqueName: \"kubernetes.io/projected/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-kube-api-access-hg6zz\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.531289 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.531259 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:09:12 +0000 UTC" deadline="2027-10-30 02:23:33.088117107 +0000 UTC" Apr 23 08:14:13.531289 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.531287 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13314h9m19.556833802s" Apr 23 08:14:13.583036 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.583008 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 08:14:13.589027 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.588976 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal" event={"ID":"65646df45a52d0346b7d7e22ae4cefe7","Type":"ContainerStarted","Data":"785670fedad49ace88b4475b1928f0f74c13ce871f2a6c21acb745d074952739"} Apr 23 08:14:13.590016 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.589987 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-129.ec2.internal" 
event={"ID":"a24ff0e0a63ca37f66f4f5e5712330cc","Type":"ContainerStarted","Data":"73a8a6fca9b60dbb4b738f92d5a3f92740ca55d46c55cb3383ab057862402ff8"} Apr 23 08:14:13.593861 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.593840 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-os-release\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.593986 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.593874 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-host-run-netns\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.593986 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.593903 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f033049a-1b87-451a-a1fc-53b7ebf036df-multus-daemon-config\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.593986 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.593928 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvqth\" (UniqueName: \"kubernetes.io/projected/5da47bef-96ab-409c-80f2-3c5cb2356454-kube-api-access-tvqth\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" Apr 23 08:14:13.593986 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.593952 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-var-lib-kubelet\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.593986 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.593972 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.594228 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.593979 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-host-run-netns\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.594228 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.593996 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hg6zz\" (UniqueName: \"kubernetes.io/projected/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-kube-api-access-hg6zz\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.594228 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.594058 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.594228 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.594061 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-os-release\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.594228 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.594062 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-var-lib-kubelet\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.594228 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.594066 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-sysctl-d\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.594228 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.594106 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-host\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.594228 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.594130 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbfh5\" (UniqueName: \"kubernetes.io/projected/f033049a-1b87-451a-a1fc-53b7ebf036df-kube-api-access-sbfh5\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.594228 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.594156 2570 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-cni-netd\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.594228 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.594184 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-ovn-node-metrics-cert\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.594228 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.594221 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-ovnkube-script-lib\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.594798 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.594241 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-cni-netd\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.594798 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.594266 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-sysctl-d\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.594798 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.594510 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f033049a-1b87-451a-a1fc-53b7ebf036df-multus-daemon-config\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.594962 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.594940 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-host\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.595020 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.594987 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9f4d9f29-8efc-4799-b966-20f9d049fd33-hosts-file\") pod \"node-resolver-mslvd\" (UID: \"9f4d9f29-8efc-4799-b966-20f9d049fd33\") " pod="openshift-dns/node-resolver-mslvd" Apr 23 08:14:13.595020 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595013 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-sysctl-conf\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.595122 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595040 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5da47bef-96ab-409c-80f2-3c5cb2356454-socket-dir\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" Apr 23 08:14:13.595173 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595105 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5da47bef-96ab-409c-80f2-3c5cb2356454-sys-fs\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" Apr 23 08:14:13.595173 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595159 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/575d4e03-f407-47ca-9efc-8c7bee335d30-cni-binary-copy\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw" Apr 23 08:14:13.595294 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595172 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5da47bef-96ab-409c-80f2-3c5cb2356454-socket-dir\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" Apr 23 08:14:13.595294 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595185 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-etc-kubernetes\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.595294 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595213 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-sysctl-conf\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.595294 ip-10-0-135-129 kubenswrapper[2570]: I0423 
08:14:13.595221 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5da47bef-96ab-409c-80f2-3c5cb2356454-sys-fs\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s"
Apr 23 08:14:13.595294 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595210 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-ovnkube-script-lib\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.595294 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595247 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5da47bef-96ab-409c-80f2-3c5cb2356454-device-dir\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s"
Apr 23 08:14:13.595294 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595255 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 23 08:14:13.595294 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595281 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-etc-kubernetes\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.595622 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595323 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-systemd\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6"
Apr 23 08:14:13.595622 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595358 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5da47bef-96ab-409c-80f2-3c5cb2356454-device-dir\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s"
Apr 23 08:14:13.595622 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595413 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-systemd\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6"
Apr 23 08:14:13.595622 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595418 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lg4tp\" (UniqueName: \"kubernetes.io/projected/575d4e03-f407-47ca-9efc-8c7bee335d30-kube-api-access-lg4tp\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw"
Apr 23 08:14:13.595622 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595454 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-kubelet\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.595622 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595479 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-systemd-units\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.595622 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595512 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-kubernetes\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6"
Apr 23 08:14:13.595622 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595537 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fca14dff-5b1c-41d2-adc2-0d42ef722a54-tmp\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6"
Apr 23 08:14:13.595622 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595562 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-kubelet\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.595622 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595605 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/575d4e03-f407-47ca-9efc-8c7bee335d30-system-cni-dir\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw"
Apr 23 08:14:13.595622 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595600 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-systemd-units\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.596039 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595565 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/575d4e03-f407-47ca-9efc-8c7bee335d30-system-cni-dir\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw"
Apr 23 08:14:13.596039 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595659 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/575d4e03-f407-47ca-9efc-8c7bee335d30-os-release\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw"
Apr 23 08:14:13.596039 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595690 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/749e4605-b37e-4004-a2dc-68092884ddae-serviceca\") pod \"node-ca-fgmqg\" (UID: \"749e4605-b37e-4004-a2dc-68092884ddae\") " pod="openshift-image-registry/node-ca-fgmqg"
Apr 23 08:14:13.596039 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595713 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/575d4e03-f407-47ca-9efc-8c7bee335d30-cni-binary-copy\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw"
Apr 23 08:14:13.596039 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595718 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/75c70767-827d-46b1-acb4-c76aff02f4bd-konnectivity-ca\") pod \"konnectivity-agent-z2ckf\" (UID: \"75c70767-827d-46b1-acb4-c76aff02f4bd\") " pod="kube-system/konnectivity-agent-z2ckf"
Apr 23 08:14:13.596039 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595663 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-kubernetes\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6"
Apr 23 08:14:13.596039 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595772 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5da47bef-96ab-409c-80f2-3c5cb2356454-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s"
Apr 23 08:14:13.596039 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595802 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mfl4\" (UniqueName: \"kubernetes.io/projected/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-kube-api-access-8mfl4\") pod \"network-metrics-daemon-4gzjb\" (UID: \"5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f\") " pod="openshift-multus/network-metrics-daemon-4gzjb"
Apr 23 08:14:13.596039 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595826 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-sys\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6"
Apr 23 08:14:13.596039 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595850 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29srk\" (UniqueName: \"kubernetes.io/projected/fca14dff-5b1c-41d2-adc2-0d42ef722a54-kube-api-access-29srk\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6"
Apr 23 08:14:13.596039 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595878 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/887c87c3-07ae-4b74-aa64-fe19546746e0-iptables-alerter-script\") pod \"iptables-alerter-7wvqt\" (UID: \"887c87c3-07ae-4b74-aa64-fe19546746e0\") " pod="openshift-network-operator/iptables-alerter-7wvqt"
Apr 23 08:14:13.596039 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595907 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/575d4e03-f407-47ca-9efc-8c7bee335d30-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw"
Apr 23 08:14:13.596039 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595945 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f033049a-1b87-451a-a1fc-53b7ebf036df-cni-binary-copy\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.596039 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595969 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-modprobe-d\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6"
Apr 23 08:14:13.596039 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596007 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dswlg\" (UniqueName: \"kubernetes.io/projected/887c87c3-07ae-4b74-aa64-fe19546746e0-kube-api-access-dswlg\") pod \"iptables-alerter-7wvqt\" (UID: \"887c87c3-07ae-4b74-aa64-fe19546746e0\") " pod="openshift-network-operator/iptables-alerter-7wvqt"
Apr 23 08:14:13.596039 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596031 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/575d4e03-f407-47ca-9efc-8c7bee335d30-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw"
Apr 23 08:14:13.596706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596072 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-multus-cni-dir\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.596706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596098 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-host-run-multus-certs\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.596706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596120 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-var-lib-openvswitch\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.596706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596122 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/749e4605-b37e-4004-a2dc-68092884ddae-serviceca\") pod \"node-ca-fgmqg\" (UID: \"749e4605-b37e-4004-a2dc-68092884ddae\") " pod="openshift-image-registry/node-ca-fgmqg"
Apr 23 08:14:13.596706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596142 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-run\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6"
Apr 23 08:14:13.596706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596169 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/887c87c3-07ae-4b74-aa64-fe19546746e0-host-slash\") pod \"iptables-alerter-7wvqt\" (UID: \"887c87c3-07ae-4b74-aa64-fe19546746e0\") " pod="openshift-network-operator/iptables-alerter-7wvqt"
Apr 23 08:14:13.596706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596222 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/575d4e03-f407-47ca-9efc-8c7bee335d30-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw"
Apr 23 08:14:13.596706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596238 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/75c70767-827d-46b1-acb4-c76aff02f4bd-konnectivity-ca\") pod \"konnectivity-agent-z2ckf\" (UID: \"75c70767-827d-46b1-acb4-c76aff02f4bd\") " pod="kube-system/konnectivity-agent-z2ckf"
Apr 23 08:14:13.596706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596247 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/749e4605-b37e-4004-a2dc-68092884ddae-host\") pod \"node-ca-fgmqg\" (UID: \"749e4605-b37e-4004-a2dc-68092884ddae\") " pod="openshift-image-registry/node-ca-fgmqg"
Apr 23 08:14:13.596706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596279 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-hostroot\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.596706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596308 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-multus-conf-dir\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.596706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596333 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-run-netns\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.596706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596356 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-etc-openvswitch\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.596706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596383 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-system-cni-dir\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.596706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596386 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-var-lib-openvswitch\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.596706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596407 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-host-run-k8s-cni-cncf-io\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.596706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596440 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-host-var-lib-cni-multus\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.596706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596464 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-host-var-lib-kubelet\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.597574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.595737 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/575d4e03-f407-47ca-9efc-8c7bee335d30-os-release\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw"
Apr 23 08:14:13.597574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596489 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-log-socket\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.597574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596543 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5da47bef-96ab-409c-80f2-3c5cb2356454-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s"
Apr 23 08:14:13.597574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596530 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-env-overrides\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.597574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596598 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-lib-modules\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6"
Apr 23 08:14:13.597574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596627 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-node-log\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.597574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596656 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-ovnkube-config\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.597574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596689 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qct9v\" (UniqueName: \"kubernetes.io/projected/9f4d9f29-8efc-4799-b966-20f9d049fd33-kube-api-access-qct9v\") pod \"node-resolver-mslvd\" (UID: \"9f4d9f29-8efc-4799-b966-20f9d049fd33\") " pod="openshift-dns/node-resolver-mslvd"
Apr 23 08:14:13.597574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596716 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsgfg\" (UniqueName: \"kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg\") pod \"network-check-target-wrt6p\" (UID: \"93ac6a8e-10aa-4687-be43-6d712bee9ebd\") " pod="openshift-network-diagnostics/network-check-target-wrt6p"
Apr 23 08:14:13.597574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596743 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-sysconfig\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6"
Apr 23 08:14:13.597574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596768 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs\") pod \"network-metrics-daemon-4gzjb\" (UID: \"5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f\") " pod="openshift-multus/network-metrics-daemon-4gzjb"
Apr 23 08:14:13.597574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596796 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5da47bef-96ab-409c-80f2-3c5cb2356454-registration-dir\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s"
Apr 23 08:14:13.597574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596825 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5da47bef-96ab-409c-80f2-3c5cb2356454-etc-selinux\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s"
Apr 23 08:14:13.597574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596810 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-sys\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6"
Apr 23 08:14:13.597574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596855 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-cnibin\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.597574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596854 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f033049a-1b87-451a-a1fc-53b7ebf036df-cni-binary-copy\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.597574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596878 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-slash\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.598321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596901 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-run-systemd\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.598321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596924 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/575d4e03-f407-47ca-9efc-8c7bee335d30-cnibin\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw"
Apr 23 08:14:13.598321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596928 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-multus-cni-dir\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.598321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596956 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csb9l\" (UniqueName: \"kubernetes.io/projected/749e4605-b37e-4004-a2dc-68092884ddae-kube-api-access-csb9l\") pod \"node-ca-fgmqg\" (UID: \"749e4605-b37e-4004-a2dc-68092884ddae\") " pod="openshift-image-registry/node-ca-fgmqg"
Apr 23 08:14:13.598321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596966 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-host-run-multus-certs\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.598321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.596981 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-run-ovn\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.598321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597006 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-cni-bin\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.598321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597062 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-cni-bin\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.598321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597118 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-run\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6"
Apr 23 08:14:13.598321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597153 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/887c87c3-07ae-4b74-aa64-fe19546746e0-host-slash\") pod \"iptables-alerter-7wvqt\" (UID: \"887c87c3-07ae-4b74-aa64-fe19546746e0\") " pod="openshift-network-operator/iptables-alerter-7wvqt"
Apr 23 08:14:13.598321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597163 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/575d4e03-f407-47ca-9efc-8c7bee335d30-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw"
Apr 23 08:14:13.598321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597218 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-modprobe-d\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6"
Apr 23 08:14:13.598321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597299 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-node-log\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.598321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597339 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/749e4605-b37e-4004-a2dc-68092884ddae-host\") pod \"node-ca-fgmqg\" (UID: \"749e4605-b37e-4004-a2dc-68092884ddae\") " pod="openshift-image-registry/node-ca-fgmqg"
Apr 23 08:14:13.598321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597404 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-hostroot\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.598321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597450 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-run-netns\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.598321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597463 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/887c87c3-07ae-4b74-aa64-fe19546746e0-iptables-alerter-script\") pod \"iptables-alerter-7wvqt\" (UID: \"887c87c3-07ae-4b74-aa64-fe19546746e0\") " pod="openshift-network-operator/iptables-alerter-7wvqt"
Apr 23 08:14:13.598321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597492 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-etc-openvswitch\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.599123 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597505 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-multus-socket-dir-parent\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.599123 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597565 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.599123 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597573 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-multus-socket-dir-parent\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.599123 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597593 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-host-var-lib-cni-bin\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.599123 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597612 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-multus-conf-dir\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.599123 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597619 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-tuned\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6"
Apr 23 08:14:13.599123 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597649 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f4d9f29-8efc-4799-b966-20f9d049fd33-tmp-dir\") pod \"node-resolver-mslvd\" (UID: \"9f4d9f29-8efc-4799-b966-20f9d049fd33\") " pod="openshift-dns/node-resolver-mslvd"
Apr 23 08:14:13.599123 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597660 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-cnibin\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.599123 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597673 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/75c70767-827d-46b1-acb4-c76aff02f4bd-agent-certs\") pod \"konnectivity-agent-z2ckf\" (UID: \"75c70767-827d-46b1-acb4-c76aff02f4bd\") " pod="kube-system/konnectivity-agent-z2ckf"
Apr 23 08:14:13.599123 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597699 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-run-openvswitch\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.599123 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597732 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-run-systemd\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.599123 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597797 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-run-ovn\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.599123 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:13.598282 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:14:13.599123 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:13.598406 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs podName:5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f nodeName:}" failed. No retries permitted until 2026-04-23 08:14:14.098370276 +0000 UTC m=+3.046557501 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs") pod "network-metrics-daemon-4gzjb" (UID: "5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:14:13.599123 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.598726 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-system-cni-dir\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz"
Apr 23 08:14:13.599123 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.598918 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-ovnkube-config\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:13.599123 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.599061 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/575d4e03-f407-47ca-9efc-8c7bee335d30-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw"
Apr 23 08:14:13.599888 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.599377 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-host-var-lib-cni-bin\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.599888 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.599423 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-host-run-k8s-cni-cncf-io\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.599888 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.599548 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-host-var-lib-cni-multus\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.599888 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.599582 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f033049a-1b87-451a-a1fc-53b7ebf036df-host-var-lib-kubelet\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.599888 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.599685 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-log-socket\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.599888 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597504 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5da47bef-96ab-409c-80f2-3c5cb2356454-etc-selinux\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" Apr 23 08:14:13.600158 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.599997 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-ovn-node-metrics-cert\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.602065 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597612 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/575d4e03-f407-47ca-9efc-8c7bee335d30-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw" Apr 23 08:14:13.602065 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.597699 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-slash\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.602065 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.600981 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-lib-modules\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.602065 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.601086 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.602065 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.601179 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-env-overrides\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.602065 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.601184 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5da47bef-96ab-409c-80f2-3c5cb2356454-registration-dir\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" Apr 23 08:14:13.602680 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.602658 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg6zz\" (UniqueName: \"kubernetes.io/projected/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-kube-api-access-hg6zz\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.603210 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.602780 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fca14dff-5b1c-41d2-adc2-0d42ef722a54-tmp\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.603304 ip-10-0-135-129 kubenswrapper[2570]: I0423 
08:14:13.603243 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-sysconfig\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.603359 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.603325 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/575d4e03-f407-47ca-9efc-8c7bee335d30-cnibin\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw" Apr 23 08:14:13.604258 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:13.604235 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:14:13.604346 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:13.604267 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:14:13.604346 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:13.604284 2570 projected.go:194] Error preparing data for projected volume kube-api-access-tsgfg for pod openshift-network-diagnostics/network-check-target-wrt6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:13.604470 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:13.604353 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg podName:93ac6a8e-10aa-4687-be43-6d712bee9ebd nodeName:}" failed. 
No retries permitted until 2026-04-23 08:14:14.10433649 +0000 UTC m=+3.052523701 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-tsgfg" (UniqueName: "kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg") pod "network-check-target-wrt6p" (UID: "93ac6a8e-10aa-4687-be43-6d712bee9ebd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:13.604470 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.604238 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg4tp\" (UniqueName: \"kubernetes.io/projected/575d4e03-f407-47ca-9efc-8c7bee335d30-kube-api-access-lg4tp\") pod \"multus-additional-cni-plugins-rctnw\" (UID: \"575d4e03-f407-47ca-9efc-8c7bee335d30\") " pod="openshift-multus/multus-additional-cni-plugins-rctnw" Apr 23 08:14:13.604470 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.604426 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbfh5\" (UniqueName: \"kubernetes.io/projected/f033049a-1b87-451a-a1fc-53b7ebf036df-kube-api-access-sbfh5\") pod \"multus-hrjxz\" (UID: \"f033049a-1b87-451a-a1fc-53b7ebf036df\") " pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.604652 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.603698 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffbe5334-bba8-45bc-bd64-2141ea3f49a8-run-openvswitch\") pod \"ovnkube-node-vsvjc\" (UID: \"ffbe5334-bba8-45bc-bd64-2141ea3f49a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.606665 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.606637 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/75c70767-827d-46b1-acb4-c76aff02f4bd-agent-certs\") pod 
\"konnectivity-agent-z2ckf\" (UID: \"75c70767-827d-46b1-acb4-c76aff02f4bd\") " pod="kube-system/konnectivity-agent-z2ckf" Apr 23 08:14:13.606760 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.606692 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mfl4\" (UniqueName: \"kubernetes.io/projected/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-kube-api-access-8mfl4\") pod \"network-metrics-daemon-4gzjb\" (UID: \"5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f\") " pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:13.606760 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.606717 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dswlg\" (UniqueName: \"kubernetes.io/projected/887c87c3-07ae-4b74-aa64-fe19546746e0-kube-api-access-dswlg\") pod \"iptables-alerter-7wvqt\" (UID: \"887c87c3-07ae-4b74-aa64-fe19546746e0\") " pod="openshift-network-operator/iptables-alerter-7wvqt" Apr 23 08:14:13.607245 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.607099 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvqth\" (UniqueName: \"kubernetes.io/projected/5da47bef-96ab-409c-80f2-3c5cb2356454-kube-api-access-tvqth\") pod \"aws-ebs-csi-driver-node-wdj9s\" (UID: \"5da47bef-96ab-409c-80f2-3c5cb2356454\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" Apr 23 08:14:13.607834 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.607806 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29srk\" (UniqueName: \"kubernetes.io/projected/fca14dff-5b1c-41d2-adc2-0d42ef722a54-kube-api-access-29srk\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.608417 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.608394 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/fca14dff-5b1c-41d2-adc2-0d42ef722a54-etc-tuned\") pod \"tuned-cxzv6\" (UID: \"fca14dff-5b1c-41d2-adc2-0d42ef722a54\") " pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.614517 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.614496 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csb9l\" (UniqueName: \"kubernetes.io/projected/749e4605-b37e-4004-a2dc-68092884ddae-kube-api-access-csb9l\") pod \"node-ca-fgmqg\" (UID: \"749e4605-b37e-4004-a2dc-68092884ddae\") " pod="openshift-image-registry/node-ca-fgmqg" Apr 23 08:14:13.698503 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.698408 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9f4d9f29-8efc-4799-b966-20f9d049fd33-hosts-file\") pod \"node-resolver-mslvd\" (UID: \"9f4d9f29-8efc-4799-b966-20f9d049fd33\") " pod="openshift-dns/node-resolver-mslvd" Apr 23 08:14:13.698503 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.698476 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qct9v\" (UniqueName: \"kubernetes.io/projected/9f4d9f29-8efc-4799-b966-20f9d049fd33-kube-api-access-qct9v\") pod \"node-resolver-mslvd\" (UID: \"9f4d9f29-8efc-4799-b966-20f9d049fd33\") " pod="openshift-dns/node-resolver-mslvd" Apr 23 08:14:13.698738 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.698533 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f4d9f29-8efc-4799-b966-20f9d049fd33-tmp-dir\") pod \"node-resolver-mslvd\" (UID: \"9f4d9f29-8efc-4799-b966-20f9d049fd33\") " pod="openshift-dns/node-resolver-mslvd" Apr 23 08:14:13.698738 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.698552 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/9f4d9f29-8efc-4799-b966-20f9d049fd33-hosts-file\") pod \"node-resolver-mslvd\" (UID: \"9f4d9f29-8efc-4799-b966-20f9d049fd33\") " pod="openshift-dns/node-resolver-mslvd" Apr 23 08:14:13.698863 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.698835 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f4d9f29-8efc-4799-b966-20f9d049fd33-tmp-dir\") pod \"node-resolver-mslvd\" (UID: \"9f4d9f29-8efc-4799-b966-20f9d049fd33\") " pod="openshift-dns/node-resolver-mslvd" Apr 23 08:14:13.706856 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.706828 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qct9v\" (UniqueName: \"kubernetes.io/projected/9f4d9f29-8efc-4799-b966-20f9d049fd33-kube-api-access-qct9v\") pod \"node-resolver-mslvd\" (UID: \"9f4d9f29-8efc-4799-b966-20f9d049fd33\") " pod="openshift-dns/node-resolver-mslvd" Apr 23 08:14:13.782743 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.782710 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7wvqt" Apr 23 08:14:13.790744 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.790717 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rctnw" Apr 23 08:14:13.799415 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.799392 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fgmqg" Apr 23 08:14:13.804854 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.804829 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hrjxz" Apr 23 08:14:13.810535 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.810513 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:13.817158 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.817136 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-z2ckf" Apr 23 08:14:13.823725 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.823704 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" Apr 23 08:14:13.829221 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.829186 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" Apr 23 08:14:13.834712 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:13.834694 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mslvd" Apr 23 08:14:14.101946 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:14.101855 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs\") pod \"network-metrics-daemon-4gzjb\" (UID: \"5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f\") " pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:14.102110 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:14.102025 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:14.102178 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:14.102112 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs podName:5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f nodeName:}" failed. No retries permitted until 2026-04-23 08:14:15.102091329 +0000 UTC m=+4.050278536 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs") pod "network-metrics-daemon-4gzjb" (UID: "5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:14.180128 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:14.179914 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod749e4605_b37e_4004_a2dc_68092884ddae.slice/crio-c5230c219aa01f4d7966f68e89cb9fef181212b9d6e85d1de4949ba0ee18f401 WatchSource:0}: Error finding container c5230c219aa01f4d7966f68e89cb9fef181212b9d6e85d1de4949ba0ee18f401: Status 404 returned error can't find the container with id c5230c219aa01f4d7966f68e89cb9fef181212b9d6e85d1de4949ba0ee18f401 Apr 23 08:14:14.181787 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:14.181762 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75c70767_827d_46b1_acb4_c76aff02f4bd.slice/crio-d857ebf5755c65fc5568f186719b43b446261746e6cabd5f73a75ac57d563074 WatchSource:0}: Error finding container d857ebf5755c65fc5568f186719b43b446261746e6cabd5f73a75ac57d563074: Status 404 returned error can't find the container with id d857ebf5755c65fc5568f186719b43b446261746e6cabd5f73a75ac57d563074 Apr 23 08:14:14.182811 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:14.182779 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffbe5334_bba8_45bc_bd64_2141ea3f49a8.slice/crio-7d74df29ae0342eb3a997d4a782ae489cfd12f77e592ce7cfe4a49fd78fd41a6 WatchSource:0}: Error finding container 7d74df29ae0342eb3a997d4a782ae489cfd12f77e592ce7cfe4a49fd78fd41a6: Status 404 returned error can't find the container with id 7d74df29ae0342eb3a997d4a782ae489cfd12f77e592ce7cfe4a49fd78fd41a6 Apr 23 08:14:14.183955 
ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:14.183903 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod575d4e03_f407_47ca_9efc_8c7bee335d30.slice/crio-1857495b099f6f950a18760268b9f8a0a788b59115fa3940cf695c766fab3a17 WatchSource:0}: Error finding container 1857495b099f6f950a18760268b9f8a0a788b59115fa3940cf695c766fab3a17: Status 404 returned error can't find the container with id 1857495b099f6f950a18760268b9f8a0a788b59115fa3940cf695c766fab3a17 Apr 23 08:14:14.188239 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:14.188091 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfca14dff_5b1c_41d2_adc2_0d42ef722a54.slice/crio-a94ddb0ecba308d741f4f7b3b87fa7090681759d3ecf48aff86db797e7cd0c9a WatchSource:0}: Error finding container a94ddb0ecba308d741f4f7b3b87fa7090681759d3ecf48aff86db797e7cd0c9a: Status 404 returned error can't find the container with id a94ddb0ecba308d741f4f7b3b87fa7090681759d3ecf48aff86db797e7cd0c9a Apr 23 08:14:14.189684 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:14.189598 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf033049a_1b87_451a_a1fc_53b7ebf036df.slice/crio-a7effd0610153710a285cddab3cc0e393cffa92e74487d8dd2d13c5449bae937 WatchSource:0}: Error finding container a7effd0610153710a285cddab3cc0e393cffa92e74487d8dd2d13c5449bae937: Status 404 returned error can't find the container with id a7effd0610153710a285cddab3cc0e393cffa92e74487d8dd2d13c5449bae937 Apr 23 08:14:14.190523 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:14.190499 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod887c87c3_07ae_4b74_aa64_fe19546746e0.slice/crio-c3767c2084253b716b86174ded14c2a8a47cadcd5c834dac5f2844cf87e0c2ce WatchSource:0}: 
Error finding container c3767c2084253b716b86174ded14c2a8a47cadcd5c834dac5f2844cf87e0c2ce: Status 404 returned error can't find the container with id c3767c2084253b716b86174ded14c2a8a47cadcd5c834dac5f2844cf87e0c2ce Apr 23 08:14:14.191599 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:14.191399 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f4d9f29_8efc_4799_b966_20f9d049fd33.slice/crio-59108e484ef363806731651d5924cb2f2961ba809c543a2d834d7ccd653cbbd4 WatchSource:0}: Error finding container 59108e484ef363806731651d5924cb2f2961ba809c543a2d834d7ccd653cbbd4: Status 404 returned error can't find the container with id 59108e484ef363806731651d5924cb2f2961ba809c543a2d834d7ccd653cbbd4 Apr 23 08:14:14.192150 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:14.192098 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5da47bef_96ab_409c_80f2_3c5cb2356454.slice/crio-469dd771c9a61a5a05d9262d582d5d391b9cfa4ecafdb166b4cd9a903f958af9 WatchSource:0}: Error finding container 469dd771c9a61a5a05d9262d582d5d391b9cfa4ecafdb166b4cd9a903f958af9: Status 404 returned error can't find the container with id 469dd771c9a61a5a05d9262d582d5d391b9cfa4ecafdb166b4cd9a903f958af9 Apr 23 08:14:14.202638 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:14.202612 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsgfg\" (UniqueName: \"kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg\") pod \"network-check-target-wrt6p\" (UID: \"93ac6a8e-10aa-4687-be43-6d712bee9ebd\") " pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:14.202765 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:14.202751 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:14:14.202802 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:14.202769 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:14:14.202802 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:14.202778 2570 projected.go:194] Error preparing data for projected volume kube-api-access-tsgfg for pod openshift-network-diagnostics/network-check-target-wrt6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:14.202861 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:14.202821 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg podName:93ac6a8e-10aa-4687-be43-6d712bee9ebd nodeName:}" failed. No retries permitted until 2026-04-23 08:14:15.20280699 +0000 UTC m=+4.150994197 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tsgfg" (UniqueName: "kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg") pod "network-check-target-wrt6p" (UID: "93ac6a8e-10aa-4687-be43-6d712bee9ebd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:14.532026 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:14.531827 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:09:12 +0000 UTC" deadline="2027-10-16 14:16:29.576434488 +0000 UTC" Apr 23 08:14:14.532026 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:14.531865 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12990h2m15.044573235s" Apr 23 08:14:14.605547 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:14.604833 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-129.ec2.internal" event={"ID":"a24ff0e0a63ca37f66f4f5e5712330cc","Type":"ContainerStarted","Data":"6620ec098d1008e2179b7f964a202acdb6e5d0a3fb55294f1b1ca512e806004b"} Apr 23 08:14:14.610325 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:14.610255 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rctnw" event={"ID":"575d4e03-f407-47ca-9efc-8c7bee335d30","Type":"ContainerStarted","Data":"1857495b099f6f950a18760268b9f8a0a788b59115fa3940cf695c766fab3a17"} Apr 23 08:14:14.612504 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:14.612406 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" event={"ID":"ffbe5334-bba8-45bc-bd64-2141ea3f49a8","Type":"ContainerStarted","Data":"7d74df29ae0342eb3a997d4a782ae489cfd12f77e592ce7cfe4a49fd78fd41a6"} Apr 23 08:14:14.618841 ip-10-0-135-129 
kubenswrapper[2570]: I0423 08:14:14.618091 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-129.ec2.internal" podStartSLOduration=2.61807312 podStartE2EDuration="2.61807312s" podCreationTimestamp="2026-04-23 08:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:14:14.617828129 +0000 UTC m=+3.566015359" watchObservedRunningTime="2026-04-23 08:14:14.61807312 +0000 UTC m=+3.566260351" Apr 23 08:14:14.624142 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:14.624110 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-z2ckf" event={"ID":"75c70767-827d-46b1-acb4-c76aff02f4bd","Type":"ContainerStarted","Data":"d857ebf5755c65fc5568f186719b43b446261746e6cabd5f73a75ac57d563074"} Apr 23 08:14:14.628736 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:14.628689 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fgmqg" event={"ID":"749e4605-b37e-4004-a2dc-68092884ddae","Type":"ContainerStarted","Data":"c5230c219aa01f4d7966f68e89cb9fef181212b9d6e85d1de4949ba0ee18f401"} Apr 23 08:14:14.636042 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:14.636016 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" event={"ID":"5da47bef-96ab-409c-80f2-3c5cb2356454","Type":"ContainerStarted","Data":"469dd771c9a61a5a05d9262d582d5d391b9cfa4ecafdb166b4cd9a903f958af9"} Apr 23 08:14:14.639667 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:14.639617 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7wvqt" event={"ID":"887c87c3-07ae-4b74-aa64-fe19546746e0","Type":"ContainerStarted","Data":"c3767c2084253b716b86174ded14c2a8a47cadcd5c834dac5f2844cf87e0c2ce"} Apr 23 08:14:14.641914 ip-10-0-135-129 kubenswrapper[2570]: I0423 
08:14:14.641887 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mslvd" event={"ID":"9f4d9f29-8efc-4799-b966-20f9d049fd33","Type":"ContainerStarted","Data":"59108e484ef363806731651d5924cb2f2961ba809c543a2d834d7ccd653cbbd4"} Apr 23 08:14:14.645850 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:14.645822 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hrjxz" event={"ID":"f033049a-1b87-451a-a1fc-53b7ebf036df","Type":"ContainerStarted","Data":"a7effd0610153710a285cddab3cc0e393cffa92e74487d8dd2d13c5449bae937"} Apr 23 08:14:14.647729 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:14.647695 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" event={"ID":"fca14dff-5b1c-41d2-adc2-0d42ef722a54","Type":"ContainerStarted","Data":"a94ddb0ecba308d741f4f7b3b87fa7090681759d3ecf48aff86db797e7cd0c9a"} Apr 23 08:14:15.113690 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:15.113652 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs\") pod \"network-metrics-daemon-4gzjb\" (UID: \"5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f\") " pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:15.113863 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:15.113811 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:15.113921 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:15.113874 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs podName:5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f nodeName:}" failed. No retries permitted until 2026-04-23 08:14:17.113856839 +0000 UTC m=+6.062044054 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs") pod "network-metrics-daemon-4gzjb" (UID: "5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:15.215630 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:15.214903 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsgfg\" (UniqueName: \"kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg\") pod \"network-check-target-wrt6p\" (UID: \"93ac6a8e-10aa-4687-be43-6d712bee9ebd\") " pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:15.215630 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:15.215085 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:14:15.215630 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:15.215116 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:14:15.215630 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:15.215129 2570 projected.go:194] Error preparing data for projected volume kube-api-access-tsgfg for pod openshift-network-diagnostics/network-check-target-wrt6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:15.215630 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:15.215189 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg podName:93ac6a8e-10aa-4687-be43-6d712bee9ebd nodeName:}" failed. 
No retries permitted until 2026-04-23 08:14:17.215171229 +0000 UTC m=+6.163358443 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-tsgfg" (UniqueName: "kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg") pod "network-check-target-wrt6p" (UID: "93ac6a8e-10aa-4687-be43-6d712bee9ebd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:15.589459 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:15.589383 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:15.589868 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:15.589507 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrt6p" podUID="93ac6a8e-10aa-4687-be43-6d712bee9ebd" Apr 23 08:14:15.589943 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:15.589927 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:15.590047 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:15.590026 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4gzjb" podUID="5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f" Apr 23 08:14:15.658047 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:15.657328 2570 generic.go:358] "Generic (PLEG): container finished" podID="65646df45a52d0346b7d7e22ae4cefe7" containerID="1f2c78d0dc2159df7c5e158d9cd63a856b98b0ec92c0c7458b6cb902b650cff5" exitCode=0 Apr 23 08:14:15.658047 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:15.657453 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal" event={"ID":"65646df45a52d0346b7d7e22ae4cefe7","Type":"ContainerDied","Data":"1f2c78d0dc2159df7c5e158d9cd63a856b98b0ec92c0c7458b6cb902b650cff5"} Apr 23 08:14:16.667815 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:16.667733 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal" event={"ID":"65646df45a52d0346b7d7e22ae4cefe7","Type":"ContainerStarted","Data":"9d0b81cccb85492cd3f61939a36e8f5e0f22f74aba1d426ca3efdb0cd286aa91"} Apr 23 08:14:17.130693 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:17.130592 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs\") pod \"network-metrics-daemon-4gzjb\" (UID: \"5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f\") " pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:17.130851 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:17.130727 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:17.130851 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:17.130796 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs 
podName:5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f nodeName:}" failed. No retries permitted until 2026-04-23 08:14:21.130776265 +0000 UTC m=+10.078963485 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs") pod "network-metrics-daemon-4gzjb" (UID: "5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:17.231159 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:17.231124 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsgfg\" (UniqueName: \"kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg\") pod \"network-check-target-wrt6p\" (UID: \"93ac6a8e-10aa-4687-be43-6d712bee9ebd\") " pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:17.231375 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:17.231319 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:14:17.231375 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:17.231339 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:14:17.231375 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:17.231351 2570 projected.go:194] Error preparing data for projected volume kube-api-access-tsgfg for pod openshift-network-diagnostics/network-check-target-wrt6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:17.231562 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:17.231409 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg podName:93ac6a8e-10aa-4687-be43-6d712bee9ebd nodeName:}" failed. No retries permitted until 2026-04-23 08:14:21.231390044 +0000 UTC m=+10.179577254 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-tsgfg" (UniqueName: "kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg") pod "network-check-target-wrt6p" (UID: "93ac6a8e-10aa-4687-be43-6d712bee9ebd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:17.586846 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:17.586766 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:17.587017 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:17.586909 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrt6p" podUID="93ac6a8e-10aa-4687-be43-6d712bee9ebd" Apr 23 08:14:17.587017 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:17.586969 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:17.587144 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:17.587102 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4gzjb" podUID="5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f" Apr 23 08:14:19.586228 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:19.586183 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:19.586738 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:19.586237 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:19.586738 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:19.586337 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4gzjb" podUID="5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f" Apr 23 08:14:19.586738 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:19.586417 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wrt6p" podUID="93ac6a8e-10aa-4687-be43-6d712bee9ebd" Apr 23 08:14:21.168722 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:21.168676 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs\") pod \"network-metrics-daemon-4gzjb\" (UID: \"5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f\") " pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:21.169246 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:21.168848 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:21.169246 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:21.168945 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs podName:5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f nodeName:}" failed. No retries permitted until 2026-04-23 08:14:29.168923814 +0000 UTC m=+18.117111043 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs") pod "network-metrics-daemon-4gzjb" (UID: "5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:21.269599 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:21.269309 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsgfg\" (UniqueName: \"kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg\") pod \"network-check-target-wrt6p\" (UID: \"93ac6a8e-10aa-4687-be43-6d712bee9ebd\") " pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:21.269599 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:21.269505 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:14:21.269599 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:21.269525 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:14:21.269599 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:21.269536 2570 projected.go:194] Error preparing data for projected volume kube-api-access-tsgfg for pod openshift-network-diagnostics/network-check-target-wrt6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:21.269599 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:21.269595 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg podName:93ac6a8e-10aa-4687-be43-6d712bee9ebd nodeName:}" failed. 
No retries permitted until 2026-04-23 08:14:29.269576802 +0000 UTC m=+18.217764032 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-tsgfg" (UniqueName: "kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg") pod "network-check-target-wrt6p" (UID: "93ac6a8e-10aa-4687-be43-6d712bee9ebd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:21.586514 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:21.586429 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:21.586680 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:21.586541 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrt6p" podUID="93ac6a8e-10aa-4687-be43-6d712bee9ebd" Apr 23 08:14:21.586930 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:21.586912 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:21.587038 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:21.587019 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4gzjb" podUID="5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f" Apr 23 08:14:22.975271 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:22.975216 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-129.ec2.internal" podStartSLOduration=10.975184801 podStartE2EDuration="10.975184801s" podCreationTimestamp="2026-04-23 08:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:14:16.681618842 +0000 UTC m=+5.629806071" watchObservedRunningTime="2026-04-23 08:14:22.975184801 +0000 UTC m=+11.923372032" Apr 23 08:14:22.976091 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:22.976068 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-kssx7"] Apr 23 08:14:22.996944 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:22.996920 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:22.997088 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:22.997011 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kssx7" podUID="48ea9757-031f-4dbe-bd48-c51513e48d24" Apr 23 08:14:23.084838 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:23.084787 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret\") pod \"global-pull-secret-syncer-kssx7\" (UID: \"48ea9757-031f-4dbe-bd48-c51513e48d24\") " pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:23.084838 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:23.084842 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/48ea9757-031f-4dbe-bd48-c51513e48d24-kubelet-config\") pod \"global-pull-secret-syncer-kssx7\" (UID: \"48ea9757-031f-4dbe-bd48-c51513e48d24\") " pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:23.085079 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:23.084893 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/48ea9757-031f-4dbe-bd48-c51513e48d24-dbus\") pod \"global-pull-secret-syncer-kssx7\" (UID: \"48ea9757-031f-4dbe-bd48-c51513e48d24\") " pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:23.186399 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:23.186354 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret\") pod \"global-pull-secret-syncer-kssx7\" (UID: \"48ea9757-031f-4dbe-bd48-c51513e48d24\") " pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:23.186550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:23.186420 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" 
(UniqueName: \"kubernetes.io/host-path/48ea9757-031f-4dbe-bd48-c51513e48d24-kubelet-config\") pod \"global-pull-secret-syncer-kssx7\" (UID: \"48ea9757-031f-4dbe-bd48-c51513e48d24\") " pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:23.186550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:23.186445 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/48ea9757-031f-4dbe-bd48-c51513e48d24-dbus\") pod \"global-pull-secret-syncer-kssx7\" (UID: \"48ea9757-031f-4dbe-bd48-c51513e48d24\") " pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:23.186550 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:23.186470 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:14:23.186550 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:23.186549 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret podName:48ea9757-031f-4dbe-bd48-c51513e48d24 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:23.686533107 +0000 UTC m=+12.634720319 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret") pod "global-pull-secret-syncer-kssx7" (UID: "48ea9757-031f-4dbe-bd48-c51513e48d24") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:14:23.186745 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:23.186549 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/48ea9757-031f-4dbe-bd48-c51513e48d24-kubelet-config\") pod \"global-pull-secret-syncer-kssx7\" (UID: \"48ea9757-031f-4dbe-bd48-c51513e48d24\") " pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:23.186745 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:23.186657 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/48ea9757-031f-4dbe-bd48-c51513e48d24-dbus\") pod \"global-pull-secret-syncer-kssx7\" (UID: \"48ea9757-031f-4dbe-bd48-c51513e48d24\") " pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:23.589141 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:23.589063 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:23.589304 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:23.589063 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:23.589304 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:23.589215 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4gzjb" podUID="5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f" Apr 23 08:14:23.589425 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:23.589309 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrt6p" podUID="93ac6a8e-10aa-4687-be43-6d712bee9ebd" Apr 23 08:14:23.691336 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:23.691289 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret\") pod \"global-pull-secret-syncer-kssx7\" (UID: \"48ea9757-031f-4dbe-bd48-c51513e48d24\") " pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:23.691479 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:23.691446 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:14:23.691536 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:23.691524 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret podName:48ea9757-031f-4dbe-bd48-c51513e48d24 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:24.691506156 +0000 UTC m=+13.639693365 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret") pod "global-pull-secret-syncer-kssx7" (UID: "48ea9757-031f-4dbe-bd48-c51513e48d24") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:14:24.586326 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:24.586295 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:24.586759 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:24.586426 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kssx7" podUID="48ea9757-031f-4dbe-bd48-c51513e48d24" Apr 23 08:14:24.698114 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:24.698076 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret\") pod \"global-pull-secret-syncer-kssx7\" (UID: \"48ea9757-031f-4dbe-bd48-c51513e48d24\") " pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:24.698295 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:24.698241 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:14:24.698345 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:24.698305 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret podName:48ea9757-031f-4dbe-bd48-c51513e48d24 nodeName:}" failed. 
No retries permitted until 2026-04-23 08:14:26.69828982 +0000 UTC m=+15.646477027 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret") pod "global-pull-secret-syncer-kssx7" (UID: "48ea9757-031f-4dbe-bd48-c51513e48d24") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:14:25.585939 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:25.585911 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:25.586111 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:25.586018 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4gzjb" podUID="5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f" Apr 23 08:14:25.586111 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:25.586077 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:25.586319 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:25.586172 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrt6p" podUID="93ac6a8e-10aa-4687-be43-6d712bee9ebd" Apr 23 08:14:26.586406 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:26.586375 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:26.586805 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:26.586490 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kssx7" podUID="48ea9757-031f-4dbe-bd48-c51513e48d24" Apr 23 08:14:26.712891 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:26.712859 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret\") pod \"global-pull-secret-syncer-kssx7\" (UID: \"48ea9757-031f-4dbe-bd48-c51513e48d24\") " pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:26.713046 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:26.712973 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:14:26.713046 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:26.713019 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret podName:48ea9757-031f-4dbe-bd48-c51513e48d24 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:30.713005918 +0000 UTC m=+19.661193126 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret") pod "global-pull-secret-syncer-kssx7" (UID: "48ea9757-031f-4dbe-bd48-c51513e48d24") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:14:27.585963 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:27.585921 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:27.586154 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:27.585968 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:27.586154 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:27.586053 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4gzjb" podUID="5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f" Apr 23 08:14:27.586291 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:27.586174 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrt6p" podUID="93ac6a8e-10aa-4687-be43-6d712bee9ebd" Apr 23 08:14:28.586413 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:28.586374 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:28.586898 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:28.586514 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kssx7" podUID="48ea9757-031f-4dbe-bd48-c51513e48d24" Apr 23 08:14:29.230180 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:29.230136 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs\") pod \"network-metrics-daemon-4gzjb\" (UID: \"5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f\") " pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:29.230374 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:29.230315 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:29.230431 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:29.230411 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs podName:5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f nodeName:}" failed. No retries permitted until 2026-04-23 08:14:45.230368936 +0000 UTC m=+34.178556144 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs") pod "network-metrics-daemon-4gzjb" (UID: "5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:29.331137 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:29.331092 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsgfg\" (UniqueName: \"kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg\") pod \"network-check-target-wrt6p\" (UID: \"93ac6a8e-10aa-4687-be43-6d712bee9ebd\") " pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:29.331321 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:29.331263 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:14:29.331321 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:29.331289 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:14:29.331321 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:29.331300 2570 projected.go:194] Error preparing data for projected volume kube-api-access-tsgfg for pod openshift-network-diagnostics/network-check-target-wrt6p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:29.331450 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:29.331355 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg podName:93ac6a8e-10aa-4687-be43-6d712bee9ebd nodeName:}" failed. 
No retries permitted until 2026-04-23 08:14:45.331341922 +0000 UTC m=+34.279529130 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-tsgfg" (UniqueName: "kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg") pod "network-check-target-wrt6p" (UID: "93ac6a8e-10aa-4687-be43-6d712bee9ebd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:14:29.588292 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:29.588225 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:29.588292 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:29.588274 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:29.588651 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:29.588355 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4gzjb" podUID="5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f" Apr 23 08:14:29.588651 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:29.588394 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wrt6p" podUID="93ac6a8e-10aa-4687-be43-6d712bee9ebd" Apr 23 08:14:30.586266 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:30.586233 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:30.586442 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:30.586360 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kssx7" podUID="48ea9757-031f-4dbe-bd48-c51513e48d24" Apr 23 08:14:30.738991 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:30.738946 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret\") pod \"global-pull-secret-syncer-kssx7\" (UID: \"48ea9757-031f-4dbe-bd48-c51513e48d24\") " pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:30.739460 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:30.739139 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:14:30.739460 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:30.739240 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret podName:48ea9757-031f-4dbe-bd48-c51513e48d24 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:38.739217645 +0000 UTC m=+27.687404867 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret") pod "global-pull-secret-syncer-kssx7" (UID: "48ea9757-031f-4dbe-bd48-c51513e48d24") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:14:31.586249 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.585936 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:31.586348 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.585944 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:31.586348 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:31.586334 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4gzjb" podUID="5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f" Apr 23 08:14:31.586470 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:31.586443 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wrt6p" podUID="93ac6a8e-10aa-4687-be43-6d712bee9ebd" Apr 23 08:14:31.697487 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.697458 2570 generic.go:358] "Generic (PLEG): container finished" podID="575d4e03-f407-47ca-9efc-8c7bee335d30" containerID="ff983e8ee2b89e83030479027c0ee41f600557dda2f66479a1a220ff61cd02f2" exitCode=0 Apr 23 08:14:31.697600 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.697537 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rctnw" event={"ID":"575d4e03-f407-47ca-9efc-8c7bee335d30","Type":"ContainerDied","Data":"ff983e8ee2b89e83030479027c0ee41f600557dda2f66479a1a220ff61cd02f2"} Apr 23 08:14:31.700485 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.700463 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:14:31.700875 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.700851 2570 generic.go:358] "Generic (PLEG): container finished" podID="ffbe5334-bba8-45bc-bd64-2141ea3f49a8" containerID="f48889811b195fca4b4c72de0239d3f68c534f98904cc8ad1a89bf9254d57e66" exitCode=1 Apr 23 08:14:31.700969 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.700886 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" event={"ID":"ffbe5334-bba8-45bc-bd64-2141ea3f49a8","Type":"ContainerStarted","Data":"56aec7d1c41b4f86cd9e344b5f87e9ad80a03c76a5418c253ec74ae528a07515"} Apr 23 08:14:31.700969 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.700932 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" event={"ID":"ffbe5334-bba8-45bc-bd64-2141ea3f49a8","Type":"ContainerStarted","Data":"fff4e07283ff66270f2e26be811446d398d069e284c8ab5bf3ea071da3f3ce6c"} Apr 23 08:14:31.701073 ip-10-0-135-129 kubenswrapper[2570]: I0423 
08:14:31.700972 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" event={"ID":"ffbe5334-bba8-45bc-bd64-2141ea3f49a8","Type":"ContainerStarted","Data":"382262d37e4295a94c822ff82c87601adbc63209d76ac36eeccbde43c1095e98"} Apr 23 08:14:31.701073 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.700993 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" event={"ID":"ffbe5334-bba8-45bc-bd64-2141ea3f49a8","Type":"ContainerStarted","Data":"a974477fee54bb1243e1c0f0b09b75d6e3a1126f940fb0f20389c64b1175d90d"} Apr 23 08:14:31.701073 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.701011 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" event={"ID":"ffbe5334-bba8-45bc-bd64-2141ea3f49a8","Type":"ContainerDied","Data":"f48889811b195fca4b4c72de0239d3f68c534f98904cc8ad1a89bf9254d57e66"} Apr 23 08:14:31.701073 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.701034 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" event={"ID":"ffbe5334-bba8-45bc-bd64-2141ea3f49a8","Type":"ContainerStarted","Data":"7a94014b8b95080c450533373b8e616c8f9ba4d4b96c1bc1e8ff01f188583041"} Apr 23 08:14:31.702343 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.702302 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-z2ckf" event={"ID":"75c70767-827d-46b1-acb4-c76aff02f4bd","Type":"ContainerStarted","Data":"e3924e9ca7d7a6d1a15d0ed61ad96af1186b9054ce98647a8361d9245b9d1afd"} Apr 23 08:14:31.703865 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.703839 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fgmqg" event={"ID":"749e4605-b37e-4004-a2dc-68092884ddae","Type":"ContainerStarted","Data":"75fe663df5bcf9268f871d8bc7612d98b53462043f9bd96d03ba96509c4e6366"} Apr 23 08:14:31.705779 ip-10-0-135-129 
kubenswrapper[2570]: I0423 08:14:31.705760 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" event={"ID":"5da47bef-96ab-409c-80f2-3c5cb2356454","Type":"ContainerStarted","Data":"40f9e03afe42dc810361b6b189c16136a7b96a1a94f3eae74c6459ad7e27e5b7"} Apr 23 08:14:31.706988 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.706966 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mslvd" event={"ID":"9f4d9f29-8efc-4799-b966-20f9d049fd33","Type":"ContainerStarted","Data":"3beaee7712329db18023df72e2234b272b2d5a2c770ee086fa8a1223cfe041bb"} Apr 23 08:14:31.708175 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.708159 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hrjxz" event={"ID":"f033049a-1b87-451a-a1fc-53b7ebf036df","Type":"ContainerStarted","Data":"2e16e927312e3e431ad903ed51e2de6242948572419d1dbf128d7fdab83733b2"} Apr 23 08:14:31.712734 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.709949 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" event={"ID":"fca14dff-5b1c-41d2-adc2-0d42ef722a54","Type":"ContainerStarted","Data":"bcaddfb53938448c50a165f3172c79db153f5d8c7d17a71fb02c319154665176"} Apr 23 08:14:31.725662 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.725622 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-cxzv6" podStartSLOduration=3.851605569 podStartE2EDuration="20.725611911s" podCreationTimestamp="2026-04-23 08:14:11 +0000 UTC" firstStartedPulling="2026-04-23 08:14:14.190099899 +0000 UTC m=+3.138287108" lastFinishedPulling="2026-04-23 08:14:31.064106242 +0000 UTC m=+20.012293450" observedRunningTime="2026-04-23 08:14:31.72561182 +0000 UTC m=+20.673799048" watchObservedRunningTime="2026-04-23 08:14:31.725611911 +0000 UTC m=+20.673799139" Apr 23 08:14:31.737270 
ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.737232 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hrjxz" podStartSLOduration=3.666326042 podStartE2EDuration="20.737221713s" podCreationTimestamp="2026-04-23 08:14:11 +0000 UTC" firstStartedPulling="2026-04-23 08:14:14.191702574 +0000 UTC m=+3.139889784" lastFinishedPulling="2026-04-23 08:14:31.262598243 +0000 UTC m=+20.210785455" observedRunningTime="2026-04-23 08:14:31.736965124 +0000 UTC m=+20.685152364" watchObservedRunningTime="2026-04-23 08:14:31.737221713 +0000 UTC m=+20.685408942" Apr 23 08:14:31.748838 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.748806 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-z2ckf" podStartSLOduration=3.868344362 podStartE2EDuration="20.74879599s" podCreationTimestamp="2026-04-23 08:14:11 +0000 UTC" firstStartedPulling="2026-04-23 08:14:14.183597718 +0000 UTC m=+3.131784924" lastFinishedPulling="2026-04-23 08:14:31.064049334 +0000 UTC m=+20.012236552" observedRunningTime="2026-04-23 08:14:31.74836235 +0000 UTC m=+20.696549589" watchObservedRunningTime="2026-04-23 08:14:31.74879599 +0000 UTC m=+20.696983220" Apr 23 08:14:31.784683 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.784645 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mslvd" podStartSLOduration=3.913752895 podStartE2EDuration="20.784633477s" podCreationTimestamp="2026-04-23 08:14:11 +0000 UTC" firstStartedPulling="2026-04-23 08:14:14.193144812 +0000 UTC m=+3.141332020" lastFinishedPulling="2026-04-23 08:14:31.064025387 +0000 UTC m=+20.012212602" observedRunningTime="2026-04-23 08:14:31.774251502 +0000 UTC m=+20.722438731" watchObservedRunningTime="2026-04-23 08:14:31.784633477 +0000 UTC m=+20.732820706" Apr 23 08:14:31.784846 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:31.784827 2570 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/node-ca-fgmqg" podStartSLOduration=3.902465222 podStartE2EDuration="20.784821051s" podCreationTimestamp="2026-04-23 08:14:11 +0000 UTC" firstStartedPulling="2026-04-23 08:14:14.181694252 +0000 UTC m=+3.129881459" lastFinishedPulling="2026-04-23 08:14:31.064050081 +0000 UTC m=+20.012237288" observedRunningTime="2026-04-23 08:14:31.784604393 +0000 UTC m=+20.732791632" watchObservedRunningTime="2026-04-23 08:14:31.784821051 +0000 UTC m=+20.733008278" Apr 23 08:14:32.535812 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:32.535612 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 08:14:32.560927 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:32.560830 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T08:14:32.535808248Z","UUID":"14df1c16-95ac-4b87-869f-d8c4cbc952f3","Handler":null,"Name":"","Endpoint":""} Apr 23 08:14:32.563333 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:32.563314 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 08:14:32.563435 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:32.563340 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 08:14:32.586055 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:32.586029 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:32.586236 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:32.586211 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kssx7" podUID="48ea9757-031f-4dbe-bd48-c51513e48d24" Apr 23 08:14:32.714001 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:32.713914 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" event={"ID":"5da47bef-96ab-409c-80f2-3c5cb2356454","Type":"ContainerStarted","Data":"644e145084b74d87d3d187bdebdc3119d912b1cd47bee34d82f3057f83a413e2"} Apr 23 08:14:32.715413 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:32.715386 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7wvqt" event={"ID":"887c87c3-07ae-4b74-aa64-fe19546746e0","Type":"ContainerStarted","Data":"b0ab46180577350b7d5fc6357f74822ccdecf02406621cbd4ac935613e7fe085"} Apr 23 08:14:32.727973 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:32.727930 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7wvqt" podStartSLOduration=4.856616373 podStartE2EDuration="21.727913342s" podCreationTimestamp="2026-04-23 08:14:11 +0000 UTC" firstStartedPulling="2026-04-23 08:14:14.19276255 +0000 UTC m=+3.140949758" lastFinishedPulling="2026-04-23 08:14:31.064059504 +0000 UTC m=+20.012246727" observedRunningTime="2026-04-23 08:14:32.727814827 +0000 UTC m=+21.676002097" watchObservedRunningTime="2026-04-23 08:14:32.727913342 +0000 UTC m=+21.676100559" Apr 23 08:14:33.586003 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:33.585965 2570 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:33.586634 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:33.586114 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4gzjb" podUID="5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f" Apr 23 08:14:33.586634 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:33.586212 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:33.586634 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:33.586320 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wrt6p" podUID="93ac6a8e-10aa-4687-be43-6d712bee9ebd" Apr 23 08:14:33.721160 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:33.721135 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:14:33.721583 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:33.721554 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" event={"ID":"ffbe5334-bba8-45bc-bd64-2141ea3f49a8","Type":"ContainerStarted","Data":"e15eea9c7b7306b5b8337193c11e484d6494c1ed2ff0ea122effec49b2e4e22b"} Apr 23 08:14:33.723941 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:33.723859 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" event={"ID":"5da47bef-96ab-409c-80f2-3c5cb2356454","Type":"ContainerStarted","Data":"cc9d8ac0017be03b171f94814085fade8ec7c7f750822bd356aa881800d48050"} Apr 23 08:14:33.752870 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:33.752825 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wdj9s" podStartSLOduration=3.407468498 podStartE2EDuration="22.752807991s" podCreationTimestamp="2026-04-23 08:14:11 +0000 UTC" firstStartedPulling="2026-04-23 08:14:14.194078307 +0000 UTC m=+3.142265518" lastFinishedPulling="2026-04-23 08:14:33.539417803 +0000 UTC m=+22.487605011" observedRunningTime="2026-04-23 08:14:33.752520859 +0000 UTC m=+22.700708086" watchObservedRunningTime="2026-04-23 08:14:33.752807991 +0000 UTC m=+22.700995220" Apr 23 08:14:34.089726 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:34.089689 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-z2ckf" Apr 23 08:14:34.090343 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:34.090317 2570 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-z2ckf" Apr 23 08:14:34.586542 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:34.586466 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:34.587019 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:34.586600 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kssx7" podUID="48ea9757-031f-4dbe-bd48-c51513e48d24" Apr 23 08:14:34.726950 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:34.726776 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-z2ckf" Apr 23 08:14:34.727259 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:34.727241 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-z2ckf" Apr 23 08:14:35.586093 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:35.586059 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:35.586093 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:35.586092 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:35.586353 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:35.586186 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrt6p" podUID="93ac6a8e-10aa-4687-be43-6d712bee9ebd" Apr 23 08:14:35.586353 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:35.586310 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4gzjb" podUID="5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f" Apr 23 08:14:36.586020 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:36.585826 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:36.586556 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:36.586061 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kssx7" podUID="48ea9757-031f-4dbe-bd48-c51513e48d24" Apr 23 08:14:36.734413 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:36.734384 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rctnw" event={"ID":"575d4e03-f407-47ca-9efc-8c7bee335d30","Type":"ContainerStarted","Data":"639bea1f37dcb8139ec2b4d738cbc90d5e26a13949c8731e249b0abb9c2d38b2"} Apr 23 08:14:36.736870 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:36.736852 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:14:36.737117 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:36.737095 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" event={"ID":"ffbe5334-bba8-45bc-bd64-2141ea3f49a8","Type":"ContainerStarted","Data":"3a287ac6c6cd3dc2fec6e83bcc4244ab47b12251b181f8f5c1ed75be4c7cbff5"} Apr 23 08:14:36.737456 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:36.737424 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:36.737580 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:36.737562 2570 scope.go:117] "RemoveContainer" containerID="f48889811b195fca4b4c72de0239d3f68c534f98904cc8ad1a89bf9254d57e66" Apr 23 08:14:36.756522 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:36.756501 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:14:37.585699 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:37.585672 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:37.585876 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:37.585680 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrt6p"
Apr 23 08:14:37.585876 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:37.585768 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4gzjb" podUID="5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f"
Apr 23 08:14:37.585966 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:37.585879 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrt6p" podUID="93ac6a8e-10aa-4687-be43-6d712bee9ebd"
Apr 23 08:14:37.740587 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:37.740554 2570 generic.go:358] "Generic (PLEG): container finished" podID="575d4e03-f407-47ca-9efc-8c7bee335d30" containerID="639bea1f37dcb8139ec2b4d738cbc90d5e26a13949c8731e249b0abb9c2d38b2" exitCode=0
Apr 23 08:14:37.741072 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:37.740648 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rctnw" event={"ID":"575d4e03-f407-47ca-9efc-8c7bee335d30","Type":"ContainerDied","Data":"639bea1f37dcb8139ec2b4d738cbc90d5e26a13949c8731e249b0abb9c2d38b2"}
Apr 23 08:14:37.746668 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:37.746646 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log"
Apr 23 08:14:37.747053 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:37.747031 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" event={"ID":"ffbe5334-bba8-45bc-bd64-2141ea3f49a8","Type":"ContainerStarted","Data":"17e18d55d182ed57a8d0bded4899f2dc6d8489d36237bd54df7c6cfbc114a077"}
Apr 23 08:14:37.747277 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:37.747262 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 08:14:37.747815 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:37.747797 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:37.769223 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:37.767412 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:37.785440 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:37.785380 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" podStartSLOduration=9.803574372 podStartE2EDuration="26.78536678s" podCreationTimestamp="2026-04-23 08:14:11 +0000 UTC" firstStartedPulling="2026-04-23 08:14:14.185071537 +0000 UTC m=+3.133258743" lastFinishedPulling="2026-04-23 08:14:31.166863931 +0000 UTC m=+20.115051151" observedRunningTime="2026-04-23 08:14:37.784741912 +0000 UTC m=+26.732929134" watchObservedRunningTime="2026-04-23 08:14:37.78536678 +0000 UTC m=+26.733554009"
Apr 23 08:14:38.408165 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:38.408126 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kssx7"]
Apr 23 08:14:38.408368 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:38.408300 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kssx7"
Apr 23 08:14:38.408431 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:38.408393 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kssx7" podUID="48ea9757-031f-4dbe-bd48-c51513e48d24"
Apr 23 08:14:38.411017 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:38.410986 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wrt6p"]
Apr 23 08:14:38.411214 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:38.411114 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrt6p"
Apr 23 08:14:38.411275 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:38.411222 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrt6p" podUID="93ac6a8e-10aa-4687-be43-6d712bee9ebd"
Apr 23 08:14:38.411536 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:38.411513 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4gzjb"]
Apr 23 08:14:38.411634 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:38.411622 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb"
Apr 23 08:14:38.411738 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:38.411720 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4gzjb" podUID="5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f"
Apr 23 08:14:38.751379 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:38.751049 2570 generic.go:358] "Generic (PLEG): container finished" podID="575d4e03-f407-47ca-9efc-8c7bee335d30" containerID="d813678c8bbb67d081fc498504aa50eefd95e1051bd51fa53a6b66669e2ae04f" exitCode=0
Apr 23 08:14:38.751709 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:38.751137 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rctnw" event={"ID":"575d4e03-f407-47ca-9efc-8c7bee335d30","Type":"ContainerDied","Data":"d813678c8bbb67d081fc498504aa50eefd95e1051bd51fa53a6b66669e2ae04f"}
Apr 23 08:14:38.751709 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:38.751545 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 08:14:38.805395 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:38.805362 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret\") pod \"global-pull-secret-syncer-kssx7\" (UID: \"48ea9757-031f-4dbe-bd48-c51513e48d24\") " pod="kube-system/global-pull-secret-syncer-kssx7"
Apr 23 08:14:38.805562 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:38.805513 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:14:38.805605 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:38.805575 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret podName:48ea9757-031f-4dbe-bd48-c51513e48d24 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:54.805560323 +0000 UTC m=+43.753747530 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret") pod "global-pull-secret-syncer-kssx7" (UID: "48ea9757-031f-4dbe-bd48-c51513e48d24") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:14:39.586026 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:39.585984 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb"
Apr 23 08:14:39.586177 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:39.586107 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4gzjb" podUID="5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f"
Apr 23 08:14:39.586259 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:39.586172 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kssx7"
Apr 23 08:14:39.586324 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:39.586303 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kssx7" podUID="48ea9757-031f-4dbe-bd48-c51513e48d24"
Apr 23 08:14:39.759093 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:39.756216 2570 generic.go:358] "Generic (PLEG): container finished" podID="575d4e03-f407-47ca-9efc-8c7bee335d30" containerID="7ce7613a10ddc6a9d48f037afa41ba83a57a00a5e455815748aa469aa8820e76" exitCode=0
Apr 23 08:14:39.759093 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:39.756664 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 08:14:39.759093 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:39.756697 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rctnw" event={"ID":"575d4e03-f407-47ca-9efc-8c7bee335d30","Type":"ContainerDied","Data":"7ce7613a10ddc6a9d48f037afa41ba83a57a00a5e455815748aa469aa8820e76"}
Apr 23 08:14:40.585780 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:40.585749 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrt6p"
Apr 23 08:14:40.585935 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:40.585876 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrt6p" podUID="93ac6a8e-10aa-4687-be43-6d712bee9ebd"
Apr 23 08:14:40.772797 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:40.772709 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc"
Apr 23 08:14:41.586383 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:41.586343 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kssx7"
Apr 23 08:14:41.586530 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:41.586471 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kssx7" podUID="48ea9757-031f-4dbe-bd48-c51513e48d24"
Apr 23 08:14:41.586530 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:41.586518 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb"
Apr 23 08:14:41.586676 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:41.586648 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4gzjb" podUID="5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f"
Apr 23 08:14:42.585933 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:42.585901 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrt6p"
Apr 23 08:14:42.586489 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:42.586022 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wrt6p" podUID="93ac6a8e-10aa-4687-be43-6d712bee9ebd"
Apr 23 08:14:43.585905 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:43.585862 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb"
Apr 23 08:14:43.585905 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:43.585891 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kssx7"
Apr 23 08:14:43.586574 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:43.586001 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4gzjb" podUID="5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f"
Apr 23 08:14:43.586574 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:43.586148 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kssx7" podUID="48ea9757-031f-4dbe-bd48-c51513e48d24"
Apr 23 08:14:44.381205 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.381175 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-129.ec2.internal" event="NodeReady"
Apr 23 08:14:44.381388 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.381345 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 23 08:14:44.419071 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.419038 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5659b6b8f-zltpw"]
Apr 23 08:14:44.444211 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.444163 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz"]
Apr 23 08:14:44.444396 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.444341 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5659b6b8f-zltpw"
Apr 23 08:14:44.447392 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.447364 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 08:14:44.447535 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.447465 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 08:14:44.447830 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.447810 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fw7pg\""
Apr 23 08:14:44.447956 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.447932 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 08:14:44.452606 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.452585 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 08:14:44.464139 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.464114 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57cffdbbd5-5knwz"]
Apr 23 08:14:44.464349 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.464328 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz"
Apr 23 08:14:44.466399 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.466377 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 23 08:14:44.466518 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.466468 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-gvqnd\""
Apr 23 08:14:44.466613 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.466592 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 23 08:14:44.483426 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.483399 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676"]
Apr 23 08:14:44.483592 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.483573 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57cffdbbd5-5knwz"
Apr 23 08:14:44.485848 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.485825 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 23 08:14:44.485848 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.485837 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 23 08:14:44.486118 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.486094 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 23 08:14:44.486449 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.486432 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 23 08:14:44.486555 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.486479 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-8rdrb\""
Apr 23 08:14:44.496827 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.496800 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2"]
Apr 23 08:14:44.496967 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.496877 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676"
Apr 23 08:14:44.499513 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.499493 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 23 08:14:44.499645 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.499493 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 23 08:14:44.499645 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.499558 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 23 08:14:44.499853 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.499833 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 23 08:14:44.512783 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.512703 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5659b6b8f-zltpw"]
Apr 23 08:14:44.512783 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.512733 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz"]
Apr 23 08:14:44.512783 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.512750 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pmw99"]
Apr 23 08:14:44.513031 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.512857 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2"
Apr 23 08:14:44.515287 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.515264 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 23 08:14:44.530306 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.530216 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676"]
Apr 23 08:14:44.530306 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.530245 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57cffdbbd5-5knwz"]
Apr 23 08:14:44.530306 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.530258 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pmw99"]
Apr 23 08:14:44.530306 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.530269 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2"]
Apr 23 08:14:44.530635 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.530344 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pmw99"
Apr 23 08:14:44.532789 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.532769 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fkwh7\""
Apr 23 08:14:44.532789 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.532784 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 08:14:44.532950 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.532825 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 08:14:44.532950 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.532877 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 08:14:44.536272 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.536252 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2lvrc"]
Apr 23 08:14:44.544391 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.544364 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9zlnz\" (UID: \"ee86940d-8ae8-4a41-80ec-b0743181280c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz"
Apr 23 08:14:44.544518 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.544406 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-image-registry-private-configuration\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw"
Apr 23 08:14:44.544518 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.544437 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-bound-sa-token\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw"
Apr 23 08:14:44.544518 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.544466 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c299t\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-kube-api-access-c299t\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw"
Apr 23 08:14:44.544669 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.544525 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-ca-trust-extracted\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw"
Apr 23 08:14:44.544669 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.544553 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-installation-pull-secrets\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw"
Apr 23 08:14:44.544669 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.544581 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ee86940d-8ae8-4a41-80ec-b0743181280c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-9zlnz\" (UID: \"ee86940d-8ae8-4a41-80ec-b0743181280c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz"
Apr 23 08:14:44.544669 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.544606 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-certificates\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw"
Apr 23 08:14:44.544669 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.544629 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-trusted-ca\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw"
Apr 23 08:14:44.544886 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.544674 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw"
Apr 23 08:14:44.554750 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.554726 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2lvrc"]
Apr 23 08:14:44.554905 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.554854 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2lvrc"
Apr 23 08:14:44.557211 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.557012 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 08:14:44.557455 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.557429 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 08:14:44.557551 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.557464 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f2cmm\""
Apr 23 08:14:44.585934 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.585894 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrt6p"
Apr 23 08:14:44.588292 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.588254 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 08:14:44.588684 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.588320 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p5rcb\""
Apr 23 08:14:44.588684 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.588335 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 08:14:44.645390 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.645307 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvhwk\" (UniqueName: \"kubernetes.io/projected/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-kube-api-access-pvhwk\") pod \"ingress-canary-pmw99\" (UID: \"bdabc8c7-71d8-4169-b0d7-8304d3f874d8\") " pod="openshift-ingress-canary/ingress-canary-pmw99"
Apr 23 08:14:44.645390 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.645353 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw"
Apr 23 08:14:44.645390 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.645378 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f8e3f488-9aff-4f7a-a300-a03290cbf7ce-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-b8989dcd8-6k676\" (UID: \"f8e3f488-9aff-4f7a-a300-a03290cbf7ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676"
Apr 23 08:14:44.645628 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.645447 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-276pn\" (UniqueName: \"kubernetes.io/projected/aab3e243-be52-4843-9a50-84da5fac7382-kube-api-access-276pn\") pod \"klusterlet-addon-workmgr-85d9b8f75b-pzlv2\" (UID: \"aab3e243-be52-4843-9a50-84da5fac7382\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2"
Apr 23 08:14:44.645628 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:44.645487 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 08:14:44.645628 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.645496 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-image-registry-private-configuration\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw"
Apr 23 08:14:44.645628 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.645542 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/76b08de2-6cb8-4320-bedc-cb3ca96031f5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-57cffdbbd5-5knwz\" (UID: \"76b08de2-6cb8-4320-bedc-cb3ca96031f5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57cffdbbd5-5knwz"
Apr 23 08:14:44.645628 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:44.645504 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5659b6b8f-zltpw: secret "image-registry-tls" not found
Apr 23 08:14:44.645628 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.645568 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc"
Apr 23 08:14:44.645628 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.645594 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f8e3f488-9aff-4f7a-a300-a03290cbf7ce-ca\") pod \"cluster-proxy-proxy-agent-b8989dcd8-6k676\" (UID: \"f8e3f488-9aff-4f7a-a300-a03290cbf7ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676"
Apr 23 08:14:44.645922 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:44.645644 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls podName:b721afd1-8a2d-49cf-9b46-55d368ce0ed8 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:45.14561803 +0000 UTC m=+34.093805258 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls") pod "image-registry-5659b6b8f-zltpw" (UID: "b721afd1-8a2d-49cf-9b46-55d368ce0ed8") : secret "image-registry-tls" not found
Apr 23 08:14:44.645922 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.645668 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c299t\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-kube-api-access-c299t\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw"
Apr 23 08:14:44.645922 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.645701 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-config-volume\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc"
Apr 23 08:14:44.645922 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.645756 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsrmg\" (UniqueName: \"kubernetes.io/projected/f8e3f488-9aff-4f7a-a300-a03290cbf7ce-kube-api-access-qsrmg\") pod \"cluster-proxy-proxy-agent-b8989dcd8-6k676\" (UID: \"f8e3f488-9aff-4f7a-a300-a03290cbf7ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676"
Apr 23 08:14:44.645922 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.645783 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aab3e243-be52-4843-9a50-84da5fac7382-tmp\") pod \"klusterlet-addon-workmgr-85d9b8f75b-pzlv2\" (UID: \"aab3e243-be52-4843-9a50-84da5fac7382\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2"
Apr 23 08:14:44.645922 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.645810 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbsjw\" (UniqueName: \"kubernetes.io/projected/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-kube-api-access-nbsjw\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc"
Apr 23 08:14:44.645922 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.645836 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert\") pod \"ingress-canary-pmw99\" (UID: \"bdabc8c7-71d8-4169-b0d7-8304d3f874d8\") " pod="openshift-ingress-canary/ingress-canary-pmw99"
Apr 23 08:14:44.645922 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.645880 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/aab3e243-be52-4843-9a50-84da5fac7382-klusterlet-config\") pod \"klusterlet-addon-workmgr-85d9b8f75b-pzlv2\" (UID: \"aab3e243-be52-4843-9a50-84da5fac7382\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2"
Apr 23 08:14:44.645922 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.645904 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f8e3f488-9aff-4f7a-a300-a03290cbf7ce-hub\") pod \"cluster-proxy-proxy-agent-b8989dcd8-6k676\" (UID: \"f8e3f488-9aff-4f7a-a300-a03290cbf7ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676"
Apr 23 08:14:44.646331 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.645975 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9zlnz\" (UID: \"ee86940d-8ae8-4a41-80ec-b0743181280c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz"
Apr 23 08:14:44.646331 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.646012 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-bound-sa-token\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw"
Apr 23 08:14:44.646331 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:44.646043 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 08:14:44.646331 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.646040 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm22t\" (UniqueName: \"kubernetes.io/projected/76b08de2-6cb8-4320-bedc-cb3ca96031f5-kube-api-access-gm22t\") pod \"managed-serviceaccount-addon-agent-57cffdbbd5-5knwz\" (UID: \"76b08de2-6cb8-4320-bedc-cb3ca96031f5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57cffdbbd5-5knwz"
Apr 23 08:14:44.646331 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:44.646091 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert
podName:ee86940d-8ae8-4a41-80ec-b0743181280c nodeName:}" failed. No retries permitted until 2026-04-23 08:14:45.146076504 +0000 UTC m=+34.094263726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9zlnz" (UID: "ee86940d-8ae8-4a41-80ec-b0743181280c") : secret "networking-console-plugin-cert" not found Apr 23 08:14:44.646331 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.646118 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-tmp-dir\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc" Apr 23 08:14:44.646331 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.646159 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-ca-trust-extracted\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:14:44.646331 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.646212 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-installation-pull-secrets\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:14:44.646331 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.646238 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/f8e3f488-9aff-4f7a-a300-a03290cbf7ce-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-b8989dcd8-6k676\" (UID: \"f8e3f488-9aff-4f7a-a300-a03290cbf7ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" Apr 23 08:14:44.646331 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.646267 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ee86940d-8ae8-4a41-80ec-b0743181280c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-9zlnz\" (UID: \"ee86940d-8ae8-4a41-80ec-b0743181280c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz" Apr 23 08:14:44.646331 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.646292 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-certificates\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:14:44.646331 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.646318 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-trusted-ca\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:14:44.646942 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.646346 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f8e3f488-9aff-4f7a-a300-a03290cbf7ce-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-b8989dcd8-6k676\" (UID: \"f8e3f488-9aff-4f7a-a300-a03290cbf7ce\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" Apr 23 08:14:44.646942 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.646597 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-ca-trust-extracted\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:14:44.647038 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.646963 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-certificates\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:14:44.647038 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.646995 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ee86940d-8ae8-4a41-80ec-b0743181280c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-9zlnz\" (UID: \"ee86940d-8ae8-4a41-80ec-b0743181280c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz" Apr 23 08:14:44.647371 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.647334 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-trusted-ca\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:14:44.650380 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.650355 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-installation-pull-secrets\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:14:44.650380 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.650369 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-image-registry-private-configuration\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:14:44.654648 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.654626 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c299t\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-kube-api-access-c299t\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:14:44.654825 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.654793 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-bound-sa-token\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:14:44.746969 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.746930 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/76b08de2-6cb8-4320-bedc-cb3ca96031f5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-57cffdbbd5-5knwz\" (UID: \"76b08de2-6cb8-4320-bedc-cb3ca96031f5\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57cffdbbd5-5knwz" Apr 23 08:14:44.746969 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.746978 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc" Apr 23 08:14:44.747261 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.746997 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f8e3f488-9aff-4f7a-a300-a03290cbf7ce-ca\") pod \"cluster-proxy-proxy-agent-b8989dcd8-6k676\" (UID: \"f8e3f488-9aff-4f7a-a300-a03290cbf7ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" Apr 23 08:14:44.747261 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.747013 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-config-volume\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc" Apr 23 08:14:44.747261 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.747072 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsrmg\" (UniqueName: \"kubernetes.io/projected/f8e3f488-9aff-4f7a-a300-a03290cbf7ce-kube-api-access-qsrmg\") pod \"cluster-proxy-proxy-agent-b8989dcd8-6k676\" (UID: \"f8e3f488-9aff-4f7a-a300-a03290cbf7ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" Apr 23 08:14:44.747261 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.747090 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/aab3e243-be52-4843-9a50-84da5fac7382-tmp\") pod \"klusterlet-addon-workmgr-85d9b8f75b-pzlv2\" (UID: \"aab3e243-be52-4843-9a50-84da5fac7382\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2" Apr 23 08:14:44.747261 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:44.747114 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:14:44.747261 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.747128 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbsjw\" (UniqueName: \"kubernetes.io/projected/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-kube-api-access-nbsjw\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc" Apr 23 08:14:44.747261 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.747157 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert\") pod \"ingress-canary-pmw99\" (UID: \"bdabc8c7-71d8-4169-b0d7-8304d3f874d8\") " pod="openshift-ingress-canary/ingress-canary-pmw99" Apr 23 08:14:44.747261 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.747185 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/aab3e243-be52-4843-9a50-84da5fac7382-klusterlet-config\") pod \"klusterlet-addon-workmgr-85d9b8f75b-pzlv2\" (UID: \"aab3e243-be52-4843-9a50-84da5fac7382\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2" Apr 23 08:14:44.747261 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.747254 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f8e3f488-9aff-4f7a-a300-a03290cbf7ce-hub\") pod 
\"cluster-proxy-proxy-agent-b8989dcd8-6k676\" (UID: \"f8e3f488-9aff-4f7a-a300-a03290cbf7ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" Apr 23 08:14:44.747708 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.747325 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gm22t\" (UniqueName: \"kubernetes.io/projected/76b08de2-6cb8-4320-bedc-cb3ca96031f5-kube-api-access-gm22t\") pod \"managed-serviceaccount-addon-agent-57cffdbbd5-5knwz\" (UID: \"76b08de2-6cb8-4320-bedc-cb3ca96031f5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57cffdbbd5-5knwz" Apr 23 08:14:44.747708 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.747370 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-tmp-dir\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc" Apr 23 08:14:44.747708 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.747416 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f8e3f488-9aff-4f7a-a300-a03290cbf7ce-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-b8989dcd8-6k676\" (UID: \"f8e3f488-9aff-4f7a-a300-a03290cbf7ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" Apr 23 08:14:44.747708 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.747449 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f8e3f488-9aff-4f7a-a300-a03290cbf7ce-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-b8989dcd8-6k676\" (UID: \"f8e3f488-9aff-4f7a-a300-a03290cbf7ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" Apr 23 08:14:44.747708 
ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:44.747481 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:14:44.747708 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:44.747546 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert podName:bdabc8c7-71d8-4169-b0d7-8304d3f874d8 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:45.247526925 +0000 UTC m=+34.195714135 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert") pod "ingress-canary-pmw99" (UID: "bdabc8c7-71d8-4169-b0d7-8304d3f874d8") : secret "canary-serving-cert" not found Apr 23 08:14:44.747708 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.747489 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pvhwk\" (UniqueName: \"kubernetes.io/projected/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-kube-api-access-pvhwk\") pod \"ingress-canary-pmw99\" (UID: \"bdabc8c7-71d8-4169-b0d7-8304d3f874d8\") " pod="openshift-ingress-canary/ingress-canary-pmw99" Apr 23 08:14:44.748057 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.747728 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f8e3f488-9aff-4f7a-a300-a03290cbf7ce-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-b8989dcd8-6k676\" (UID: \"f8e3f488-9aff-4f7a-a300-a03290cbf7ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" Apr 23 08:14:44.748057 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.747740 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aab3e243-be52-4843-9a50-84da5fac7382-tmp\") pod 
\"klusterlet-addon-workmgr-85d9b8f75b-pzlv2\" (UID: \"aab3e243-be52-4843-9a50-84da5fac7382\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2" Apr 23 08:14:44.748057 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.747760 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-276pn\" (UniqueName: \"kubernetes.io/projected/aab3e243-be52-4843-9a50-84da5fac7382-kube-api-access-276pn\") pod \"klusterlet-addon-workmgr-85d9b8f75b-pzlv2\" (UID: \"aab3e243-be52-4843-9a50-84da5fac7382\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2" Apr 23 08:14:44.748057 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:44.747783 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls podName:52a0af2f-2394-4095-ac7f-6cc85c59c3a7 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:45.247768837 +0000 UTC m=+34.195956044 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls") pod "dns-default-2lvrc" (UID: "52a0af2f-2394-4095-ac7f-6cc85c59c3a7") : secret "dns-default-metrics-tls" not found Apr 23 08:14:44.748284 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.748083 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-tmp-dir\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc" Apr 23 08:14:44.748343 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.748289 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-config-volume\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc" Apr 23 08:14:44.748712 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.748670 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f8e3f488-9aff-4f7a-a300-a03290cbf7ce-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-b8989dcd8-6k676\" (UID: \"f8e3f488-9aff-4f7a-a300-a03290cbf7ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" Apr 23 08:14:44.750378 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.750332 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f8e3f488-9aff-4f7a-a300-a03290cbf7ce-ca\") pod \"cluster-proxy-proxy-agent-b8989dcd8-6k676\" (UID: \"f8e3f488-9aff-4f7a-a300-a03290cbf7ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" Apr 23 08:14:44.750876 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.750850 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/76b08de2-6cb8-4320-bedc-cb3ca96031f5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-57cffdbbd5-5knwz\" (UID: \"76b08de2-6cb8-4320-bedc-cb3ca96031f5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57cffdbbd5-5knwz" Apr 23 08:14:44.750996 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.750891 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f8e3f488-9aff-4f7a-a300-a03290cbf7ce-hub\") pod \"cluster-proxy-proxy-agent-b8989dcd8-6k676\" (UID: \"f8e3f488-9aff-4f7a-a300-a03290cbf7ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" Apr 23 08:14:44.751061 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.750994 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f8e3f488-9aff-4f7a-a300-a03290cbf7ce-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-b8989dcd8-6k676\" (UID: \"f8e3f488-9aff-4f7a-a300-a03290cbf7ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" Apr 23 08:14:44.751061 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.751028 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/aab3e243-be52-4843-9a50-84da5fac7382-klusterlet-config\") pod \"klusterlet-addon-workmgr-85d9b8f75b-pzlv2\" (UID: \"aab3e243-be52-4843-9a50-84da5fac7382\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2" Apr 23 08:14:44.751442 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.751419 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f8e3f488-9aff-4f7a-a300-a03290cbf7ce-hub-kubeconfig\") pod 
\"cluster-proxy-proxy-agent-b8989dcd8-6k676\" (UID: \"f8e3f488-9aff-4f7a-a300-a03290cbf7ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" Apr 23 08:14:44.758754 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.758726 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm22t\" (UniqueName: \"kubernetes.io/projected/76b08de2-6cb8-4320-bedc-cb3ca96031f5-kube-api-access-gm22t\") pod \"managed-serviceaccount-addon-agent-57cffdbbd5-5knwz\" (UID: \"76b08de2-6cb8-4320-bedc-cb3ca96031f5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57cffdbbd5-5knwz" Apr 23 08:14:44.758926 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.758907 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsrmg\" (UniqueName: \"kubernetes.io/projected/f8e3f488-9aff-4f7a-a300-a03290cbf7ce-kube-api-access-qsrmg\") pod \"cluster-proxy-proxy-agent-b8989dcd8-6k676\" (UID: \"f8e3f488-9aff-4f7a-a300-a03290cbf7ce\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" Apr 23 08:14:44.759884 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.759862 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-276pn\" (UniqueName: \"kubernetes.io/projected/aab3e243-be52-4843-9a50-84da5fac7382-kube-api-access-276pn\") pod \"klusterlet-addon-workmgr-85d9b8f75b-pzlv2\" (UID: \"aab3e243-be52-4843-9a50-84da5fac7382\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2" Apr 23 08:14:44.766649 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.766627 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvhwk\" (UniqueName: \"kubernetes.io/projected/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-kube-api-access-pvhwk\") pod \"ingress-canary-pmw99\" (UID: \"bdabc8c7-71d8-4169-b0d7-8304d3f874d8\") " 
pod="openshift-ingress-canary/ingress-canary-pmw99" Apr 23 08:14:44.769047 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.769028 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbsjw\" (UniqueName: \"kubernetes.io/projected/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-kube-api-access-nbsjw\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc" Apr 23 08:14:44.808242 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.808172 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57cffdbbd5-5knwz" Apr 23 08:14:44.817260 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.817230 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" Apr 23 08:14:44.835062 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:44.835032 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2" Apr 23 08:14:45.152271 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.152232 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:14:45.152518 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.152370 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9zlnz\" (UID: \"ee86940d-8ae8-4a41-80ec-b0743181280c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz" Apr 23 08:14:45.152518 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:45.152404 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:14:45.152518 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:45.152425 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5659b6b8f-zltpw: secret "image-registry-tls" not found Apr 23 08:14:45.152518 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:45.152492 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls podName:b721afd1-8a2d-49cf-9b46-55d368ce0ed8 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:46.152475349 +0000 UTC m=+35.100662557 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls") pod "image-registry-5659b6b8f-zltpw" (UID: "b721afd1-8a2d-49cf-9b46-55d368ce0ed8") : secret "image-registry-tls" not found Apr 23 08:14:45.152518 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:45.152503 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 08:14:45.152759 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:45.152568 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert podName:ee86940d-8ae8-4a41-80ec-b0743181280c nodeName:}" failed. No retries permitted until 2026-04-23 08:14:46.152550912 +0000 UTC m=+35.100738138 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9zlnz" (UID: "ee86940d-8ae8-4a41-80ec-b0743181280c") : secret "networking-console-plugin-cert" not found Apr 23 08:14:45.253432 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.253388 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc" Apr 23 08:14:45.253630 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.253468 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert\") pod \"ingress-canary-pmw99\" (UID: \"bdabc8c7-71d8-4169-b0d7-8304d3f874d8\") " 
pod="openshift-ingress-canary/ingress-canary-pmw99" Apr 23 08:14:45.253630 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:45.253595 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:14:45.253726 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.253631 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs\") pod \"network-metrics-daemon-4gzjb\" (UID: \"5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f\") " pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:45.253726 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:45.253597 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:14:45.253726 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:45.253670 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert podName:bdabc8c7-71d8-4169-b0d7-8304d3f874d8 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:46.2536501 +0000 UTC m=+35.201837310 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert") pod "ingress-canary-pmw99" (UID: "bdabc8c7-71d8-4169-b0d7-8304d3f874d8") : secret "canary-serving-cert" not found Apr 23 08:14:45.253726 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:45.253707 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:45.253726 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:45.253716 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls podName:52a0af2f-2394-4095-ac7f-6cc85c59c3a7 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:46.253699086 +0000 UTC m=+35.201886309 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls") pod "dns-default-2lvrc" (UID: "52a0af2f-2394-4095-ac7f-6cc85c59c3a7") : secret "dns-default-metrics-tls" not found Apr 23 08:14:45.254015 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:45.253758 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs podName:5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f nodeName:}" failed. No retries permitted until 2026-04-23 08:15:17.25374676 +0000 UTC m=+66.201933974 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs") pod "network-metrics-daemon-4gzjb" (UID: "5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:14:45.355341 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.355134 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsgfg\" (UniqueName: \"kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg\") pod \"network-check-target-wrt6p\" (UID: \"93ac6a8e-10aa-4687-be43-6d712bee9ebd\") " pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:45.358236 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.358206 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsgfg\" (UniqueName: \"kubernetes.io/projected/93ac6a8e-10aa-4687-be43-6d712bee9ebd-kube-api-access-tsgfg\") pod \"network-check-target-wrt6p\" (UID: \"93ac6a8e-10aa-4687-be43-6d712bee9ebd\") " pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:45.500720 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.497437 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:45.522816 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.522752 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2"] Apr 23 08:14:45.524214 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.523845 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676"] Apr 23 08:14:45.524771 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.524751 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57cffdbbd5-5knwz"] Apr 23 08:14:45.588749 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.588723 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:45.589178 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.588724 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:14:45.590894 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.590859 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:14:45.590977 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.590873 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vjj4z\"" Apr 23 08:14:45.591019 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.591006 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 08:14:45.643050 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:45.643013 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76b08de2_6cb8_4320_bedc_cb3ca96031f5.slice/crio-b6c59121e3318907f0e79a7aaf0d6e7c1975c12c8e2ce45c2bf49fb9b5b50f60 WatchSource:0}: Error finding container b6c59121e3318907f0e79a7aaf0d6e7c1975c12c8e2ce45c2bf49fb9b5b50f60: Status 404 returned error can't find the container with id b6c59121e3318907f0e79a7aaf0d6e7c1975c12c8e2ce45c2bf49fb9b5b50f60 Apr 23 08:14:45.643475 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:45.643443 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaab3e243_be52_4843_9a50_84da5fac7382.slice/crio-14b11afb1de22bf141ecc093b7b657d9b78ae91a413ac3dfbcfc4b8a29f666c7 WatchSource:0}: Error finding container 14b11afb1de22bf141ecc093b7b657d9b78ae91a413ac3dfbcfc4b8a29f666c7: Status 404 returned error can't find the container with id 14b11afb1de22bf141ecc093b7b657d9b78ae91a413ac3dfbcfc4b8a29f666c7 Apr 23 08:14:45.644081 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:45.644055 2570 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8e3f488_9aff_4f7a_a300_a03290cbf7ce.slice/crio-72711ef95c93c796b451735eea5f192d7703c013c37b8514fc855505813a6313 WatchSource:0}: Error finding container 72711ef95c93c796b451735eea5f192d7703c013c37b8514fc855505813a6313: Status 404 returned error can't find the container with id 72711ef95c93c796b451735eea5f192d7703c013c37b8514fc855505813a6313 Apr 23 08:14:45.769973 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.769937 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" event={"ID":"f8e3f488-9aff-4f7a-a300-a03290cbf7ce","Type":"ContainerStarted","Data":"72711ef95c93c796b451735eea5f192d7703c013c37b8514fc855505813a6313"} Apr 23 08:14:45.771986 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.771948 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57cffdbbd5-5knwz" event={"ID":"76b08de2-6cb8-4320-bedc-cb3ca96031f5","Type":"ContainerStarted","Data":"b6c59121e3318907f0e79a7aaf0d6e7c1975c12c8e2ce45c2bf49fb9b5b50f60"} Apr 23 08:14:45.773692 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.773659 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2" event={"ID":"aab3e243-be52-4843-9a50-84da5fac7382","Type":"ContainerStarted","Data":"14b11afb1de22bf141ecc093b7b657d9b78ae91a413ac3dfbcfc4b8a29f666c7"} Apr 23 08:14:45.790020 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:45.789989 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wrt6p"] Apr 23 08:14:45.795028 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:45.794990 2570 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93ac6a8e_10aa_4687_be43_6d712bee9ebd.slice/crio-867e0ed84cc30b98d53c36ec1958e45b6fce12fba1ce691b8696cadff800a4f0 WatchSource:0}: Error finding container 867e0ed84cc30b98d53c36ec1958e45b6fce12fba1ce691b8696cadff800a4f0: Status 404 returned error can't find the container with id 867e0ed84cc30b98d53c36ec1958e45b6fce12fba1ce691b8696cadff800a4f0 Apr 23 08:14:46.163326 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:46.163286 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:14:46.163554 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:46.163392 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9zlnz\" (UID: \"ee86940d-8ae8-4a41-80ec-b0743181280c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz" Apr 23 08:14:46.163605 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:46.163556 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:14:46.163605 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:46.163581 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5659b6b8f-zltpw: secret "image-registry-tls" not found Apr 23 08:14:46.163677 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:46.163630 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret 
"networking-console-plugin-cert" not found Apr 23 08:14:46.163677 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:46.163646 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls podName:b721afd1-8a2d-49cf-9b46-55d368ce0ed8 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:48.163624436 +0000 UTC m=+37.111811648 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls") pod "image-registry-5659b6b8f-zltpw" (UID: "b721afd1-8a2d-49cf-9b46-55d368ce0ed8") : secret "image-registry-tls" not found Apr 23 08:14:46.163786 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:46.163693 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert podName:ee86940d-8ae8-4a41-80ec-b0743181280c nodeName:}" failed. No retries permitted until 2026-04-23 08:14:48.163676352 +0000 UTC m=+37.111863577 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9zlnz" (UID: "ee86940d-8ae8-4a41-80ec-b0743181280c") : secret "networking-console-plugin-cert" not found Apr 23 08:14:46.264368 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:46.264275 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc" Apr 23 08:14:46.264368 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:46.264339 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert\") pod \"ingress-canary-pmw99\" (UID: \"bdabc8c7-71d8-4169-b0d7-8304d3f874d8\") " pod="openshift-ingress-canary/ingress-canary-pmw99" Apr 23 08:14:46.264602 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:46.264438 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:14:46.264602 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:46.264460 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:14:46.264602 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:46.264512 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls podName:52a0af2f-2394-4095-ac7f-6cc85c59c3a7 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:48.264491969 +0000 UTC m=+37.212679188 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls") pod "dns-default-2lvrc" (UID: "52a0af2f-2394-4095-ac7f-6cc85c59c3a7") : secret "dns-default-metrics-tls" not found Apr 23 08:14:46.264602 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:46.264529 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert podName:bdabc8c7-71d8-4169-b0d7-8304d3f874d8 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:48.264522297 +0000 UTC m=+37.212709504 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert") pod "ingress-canary-pmw99" (UID: "bdabc8c7-71d8-4169-b0d7-8304d3f874d8") : secret "canary-serving-cert" not found Apr 23 08:14:46.782241 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:46.782188 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wrt6p" event={"ID":"93ac6a8e-10aa-4687-be43-6d712bee9ebd","Type":"ContainerStarted","Data":"867e0ed84cc30b98d53c36ec1958e45b6fce12fba1ce691b8696cadff800a4f0"} Apr 23 08:14:46.789003 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:46.788971 2570 generic.go:358] "Generic (PLEG): container finished" podID="575d4e03-f407-47ca-9efc-8c7bee335d30" containerID="f278f97602bcdeabea7b223e5a98d9bf99d98b66417667054a2370374cd7a2eb" exitCode=0 Apr 23 08:14:46.789144 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:46.789040 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rctnw" event={"ID":"575d4e03-f407-47ca-9efc-8c7bee335d30","Type":"ContainerDied","Data":"f278f97602bcdeabea7b223e5a98d9bf99d98b66417667054a2370374cd7a2eb"} Apr 23 08:14:47.796464 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:47.796103 2570 generic.go:358] "Generic (PLEG): container finished" 
podID="575d4e03-f407-47ca-9efc-8c7bee335d30" containerID="86df391b08cee7f20a463f406ab4dec98fad1cf4b9f93d05a554949751c6a1f4" exitCode=0 Apr 23 08:14:47.796464 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:47.796407 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rctnw" event={"ID":"575d4e03-f407-47ca-9efc-8c7bee335d30","Type":"ContainerDied","Data":"86df391b08cee7f20a463f406ab4dec98fad1cf4b9f93d05a554949751c6a1f4"} Apr 23 08:14:48.183339 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:48.183253 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9zlnz\" (UID: \"ee86940d-8ae8-4a41-80ec-b0743181280c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz" Apr 23 08:14:48.183501 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:48.183341 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:14:48.183501 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:48.183478 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:14:48.183501 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:48.183494 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5659b6b8f-zltpw: secret "image-registry-tls" not found Apr 23 08:14:48.183652 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:48.183554 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls podName:b721afd1-8a2d-49cf-9b46-55d368ce0ed8 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:52.183535057 +0000 UTC m=+41.131722267 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls") pod "image-registry-5659b6b8f-zltpw" (UID: "b721afd1-8a2d-49cf-9b46-55d368ce0ed8") : secret "image-registry-tls" not found Apr 23 08:14:48.184038 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:48.184015 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 08:14:48.184155 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:48.184073 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert podName:ee86940d-8ae8-4a41-80ec-b0743181280c nodeName:}" failed. No retries permitted until 2026-04-23 08:14:52.184058525 +0000 UTC m=+41.132245733 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9zlnz" (UID: "ee86940d-8ae8-4a41-80ec-b0743181280c") : secret "networking-console-plugin-cert" not found Apr 23 08:14:48.284522 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:48.283986 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc" Apr 23 08:14:48.284522 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:48.284043 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert\") pod \"ingress-canary-pmw99\" (UID: \"bdabc8c7-71d8-4169-b0d7-8304d3f874d8\") " pod="openshift-ingress-canary/ingress-canary-pmw99" Apr 23 08:14:48.284522 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:48.284207 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:14:48.284522 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:48.284275 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert podName:bdabc8c7-71d8-4169-b0d7-8304d3f874d8 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:52.28425452 +0000 UTC m=+41.232441730 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert") pod "ingress-canary-pmw99" (UID: "bdabc8c7-71d8-4169-b0d7-8304d3f874d8") : secret "canary-serving-cert" not found Apr 23 08:14:48.284522 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:48.284384 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:14:48.284522 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:48.284436 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls podName:52a0af2f-2394-4095-ac7f-6cc85c59c3a7 nodeName:}" failed. No retries permitted until 2026-04-23 08:14:52.284422954 +0000 UTC m=+41.232610171 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls") pod "dns-default-2lvrc" (UID: "52a0af2f-2394-4095-ac7f-6cc85c59c3a7") : secret "dns-default-metrics-tls" not found Apr 23 08:14:52.219000 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:52.218957 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:14:52.219635 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:52.219088 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9zlnz\" (UID: \"ee86940d-8ae8-4a41-80ec-b0743181280c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz" 
Apr 23 08:14:52.219635 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:52.219161 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:14:52.219635 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:52.219504 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 08:14:52.219635 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:52.219594 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5659b6b8f-zltpw: secret "image-registry-tls" not found Apr 23 08:14:52.219845 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:52.219690 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert podName:ee86940d-8ae8-4a41-80ec-b0743181280c nodeName:}" failed. No retries permitted until 2026-04-23 08:15:00.219652462 +0000 UTC m=+49.167839670 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9zlnz" (UID: "ee86940d-8ae8-4a41-80ec-b0743181280c") : secret "networking-console-plugin-cert" not found Apr 23 08:14:52.219845 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:52.219727 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls podName:b721afd1-8a2d-49cf-9b46-55d368ce0ed8 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:00.219703245 +0000 UTC m=+49.167890455 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls") pod "image-registry-5659b6b8f-zltpw" (UID: "b721afd1-8a2d-49cf-9b46-55d368ce0ed8") : secret "image-registry-tls" not found Apr 23 08:14:52.320023 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:52.319982 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc" Apr 23 08:14:52.320241 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:52.320033 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert\") pod \"ingress-canary-pmw99\" (UID: \"bdabc8c7-71d8-4169-b0d7-8304d3f874d8\") " pod="openshift-ingress-canary/ingress-canary-pmw99" Apr 23 08:14:52.320241 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:52.320148 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:14:52.320241 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:52.320171 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:14:52.320241 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:52.320244 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls podName:52a0af2f-2394-4095-ac7f-6cc85c59c3a7 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:00.320222291 +0000 UTC m=+49.268409513 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls") pod "dns-default-2lvrc" (UID: "52a0af2f-2394-4095-ac7f-6cc85c59c3a7") : secret "dns-default-metrics-tls" not found Apr 23 08:14:52.320434 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:14:52.320265 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert podName:bdabc8c7-71d8-4169-b0d7-8304d3f874d8 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:00.32025619 +0000 UTC m=+49.268443408 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert") pod "ingress-canary-pmw99" (UID: "bdabc8c7-71d8-4169-b0d7-8304d3f874d8") : secret "canary-serving-cert" not found Apr 23 08:14:53.810368 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:53.810332 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2" event={"ID":"aab3e243-be52-4843-9a50-84da5fac7382","Type":"ContainerStarted","Data":"17dae5ea0c4f79226e5af3bfd5ba1ff6c85d02935a3d8eb4c176d23b78fa230e"} Apr 23 08:14:53.810854 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:53.810584 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2" Apr 23 08:14:53.811738 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:53.811706 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wrt6p" event={"ID":"93ac6a8e-10aa-4687-be43-6d712bee9ebd","Type":"ContainerStarted","Data":"aae979133ee351655ae66588a0f9bc03ea759c83d34f15c337f1ff6cbe2395e8"} Apr 23 08:14:53.811870 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:53.811831 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:14:53.812360 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:53.812345 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2" Apr 23 08:14:53.812959 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:53.812933 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57cffdbbd5-5knwz" event={"ID":"76b08de2-6cb8-4320-bedc-cb3ca96031f5","Type":"ContainerStarted","Data":"bd83c3fbc3ecb6d7152e300831fca6b6be7b9bcb8245a2e98d014da699363d50"} Apr 23 08:14:53.815657 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:53.815637 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rctnw" event={"ID":"575d4e03-f407-47ca-9efc-8c7bee335d30","Type":"ContainerStarted","Data":"264839f6ab4d3b462f466bc04bb864d6387cff7b989d81204c6a7ff43bf166a2"} Apr 23 08:14:53.816819 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:53.816802 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" event={"ID":"f8e3f488-9aff-4f7a-a300-a03290cbf7ce","Type":"ContainerStarted","Data":"c567f6474357549c2ab70245e034d287a9f8420d3047b3bf7a70da86be83d191"} Apr 23 08:14:53.825043 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:53.824983 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2" podStartSLOduration=7.399258035 podStartE2EDuration="14.824972918s" podCreationTimestamp="2026-04-23 08:14:39 +0000 UTC" firstStartedPulling="2026-04-23 08:14:45.662431305 +0000 UTC m=+34.610618512" lastFinishedPulling="2026-04-23 08:14:53.088146185 +0000 UTC m=+42.036333395" observedRunningTime="2026-04-23 08:14:53.824329874 +0000 
UTC m=+42.772517113" watchObservedRunningTime="2026-04-23 08:14:53.824972918 +0000 UTC m=+42.773160125" Apr 23 08:14:53.837340 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:53.837178 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-wrt6p" podStartSLOduration=35.535983016 podStartE2EDuration="42.837163581s" podCreationTimestamp="2026-04-23 08:14:11 +0000 UTC" firstStartedPulling="2026-04-23 08:14:45.79712391 +0000 UTC m=+34.745311121" lastFinishedPulling="2026-04-23 08:14:53.098304474 +0000 UTC m=+42.046491686" observedRunningTime="2026-04-23 08:14:53.836332167 +0000 UTC m=+42.784519396" watchObservedRunningTime="2026-04-23 08:14:53.837163581 +0000 UTC m=+42.785350813" Apr 23 08:14:53.856293 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:53.856258 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rctnw" podStartSLOduration=11.347128612 podStartE2EDuration="42.856246946s" podCreationTimestamp="2026-04-23 08:14:11 +0000 UTC" firstStartedPulling="2026-04-23 08:14:14.186021392 +0000 UTC m=+3.134208602" lastFinishedPulling="2026-04-23 08:14:45.695139728 +0000 UTC m=+34.643326936" observedRunningTime="2026-04-23 08:14:53.855752589 +0000 UTC m=+42.803939817" watchObservedRunningTime="2026-04-23 08:14:53.856246946 +0000 UTC m=+42.804434152" Apr 23 08:14:53.882308 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:53.882265 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57cffdbbd5-5knwz" podStartSLOduration=7.456749886 podStartE2EDuration="14.882253172s" podCreationTimestamp="2026-04-23 08:14:39 +0000 UTC" firstStartedPulling="2026-04-23 08:14:45.662641401 +0000 UTC m=+34.610828618" lastFinishedPulling="2026-04-23 08:14:53.08814468 +0000 UTC m=+42.036331904" observedRunningTime="2026-04-23 08:14:53.881998747 +0000 UTC 
m=+42.830185975" watchObservedRunningTime="2026-04-23 08:14:53.882253172 +0000 UTC m=+42.830440444" Apr 23 08:14:54.837769 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:54.837733 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret\") pod \"global-pull-secret-syncer-kssx7\" (UID: \"48ea9757-031f-4dbe-bd48-c51513e48d24\") " pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:54.841920 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:54.841885 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/48ea9757-031f-4dbe-bd48-c51513e48d24-original-pull-secret\") pod \"global-pull-secret-syncer-kssx7\" (UID: \"48ea9757-031f-4dbe-bd48-c51513e48d24\") " pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:54.898838 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:54.898803 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kssx7" Apr 23 08:14:55.010999 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:55.010964 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kssx7"] Apr 23 08:14:55.589521 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:14:55.589491 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48ea9757_031f_4dbe_bd48_c51513e48d24.slice/crio-c578da4681e25264d6226752e06269ef428b9a663e5a924f7e34ad2e127a3f36 WatchSource:0}: Error finding container c578da4681e25264d6226752e06269ef428b9a663e5a924f7e34ad2e127a3f36: Status 404 returned error can't find the container with id c578da4681e25264d6226752e06269ef428b9a663e5a924f7e34ad2e127a3f36 Apr 23 08:14:55.821595 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:55.821556 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kssx7" event={"ID":"48ea9757-031f-4dbe-bd48-c51513e48d24","Type":"ContainerStarted","Data":"c578da4681e25264d6226752e06269ef428b9a663e5a924f7e34ad2e127a3f36"} Apr 23 08:14:56.827546 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:56.827508 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" event={"ID":"f8e3f488-9aff-4f7a-a300-a03290cbf7ce","Type":"ContainerStarted","Data":"a01146a026c89aa9ad5572fc87cc31c10061ff4c40dbda35b432205ecaeab247"} Apr 23 08:14:56.827546 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:56.827550 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" event={"ID":"f8e3f488-9aff-4f7a-a300-a03290cbf7ce","Type":"ContainerStarted","Data":"24edb7a732f8a7095e4d57f37ccf5130d104e6c1e7131c2aa6fb366643da5b15"} Apr 23 08:14:56.844495 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:14:56.844437 2570 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" podStartSLOduration=7.5030514539999995 podStartE2EDuration="17.844420418s" podCreationTimestamp="2026-04-23 08:14:39 +0000 UTC" firstStartedPulling="2026-04-23 08:14:45.662532884 +0000 UTC m=+34.610720120" lastFinishedPulling="2026-04-23 08:14:56.003901873 +0000 UTC m=+44.952089084" observedRunningTime="2026-04-23 08:14:56.84325382 +0000 UTC m=+45.791441051" watchObservedRunningTime="2026-04-23 08:14:56.844420418 +0000 UTC m=+45.792607649" Apr 23 08:15:00.281833 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:15:00.281789 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9zlnz\" (UID: \"ee86940d-8ae8-4a41-80ec-b0743181280c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz" Apr 23 08:15:00.282321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:15:00.281880 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:15:00.282321 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:00.281992 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 08:15:00.282321 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:00.282085 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert 
podName:ee86940d-8ae8-4a41-80ec-b0743181280c nodeName:}" failed. No retries permitted until 2026-04-23 08:15:16.282061855 +0000 UTC m=+65.230249077 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9zlnz" (UID: "ee86940d-8ae8-4a41-80ec-b0743181280c") : secret "networking-console-plugin-cert" not found Apr 23 08:15:00.282321 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:00.282002 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:15:00.282321 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:00.282114 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5659b6b8f-zltpw: secret "image-registry-tls" not found Apr 23 08:15:00.282321 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:00.282227 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls podName:b721afd1-8a2d-49cf-9b46-55d368ce0ed8 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:16.282207321 +0000 UTC m=+65.230394544 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls") pod "image-registry-5659b6b8f-zltpw" (UID: "b721afd1-8a2d-49cf-9b46-55d368ce0ed8") : secret "image-registry-tls" not found Apr 23 08:15:00.383212 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:15:00.383159 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert\") pod \"ingress-canary-pmw99\" (UID: \"bdabc8c7-71d8-4169-b0d7-8304d3f874d8\") " pod="openshift-ingress-canary/ingress-canary-pmw99" Apr 23 08:15:00.383395 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:15:00.383316 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc" Apr 23 08:15:00.383395 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:00.383315 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:15:00.383395 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:00.383368 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:15:00.383522 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:00.383402 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert podName:bdabc8c7-71d8-4169-b0d7-8304d3f874d8 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:16.383384364 +0000 UTC m=+65.331571572 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert") pod "ingress-canary-pmw99" (UID: "bdabc8c7-71d8-4169-b0d7-8304d3f874d8") : secret "canary-serving-cert" not found Apr 23 08:15:00.383522 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:00.383421 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls podName:52a0af2f-2394-4095-ac7f-6cc85c59c3a7 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:16.383412472 +0000 UTC m=+65.331599682 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls") pod "dns-default-2lvrc" (UID: "52a0af2f-2394-4095-ac7f-6cc85c59c3a7") : secret "dns-default-metrics-tls" not found Apr 23 08:15:01.848124 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:15:01.848089 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kssx7" event={"ID":"48ea9757-031f-4dbe-bd48-c51513e48d24","Type":"ContainerStarted","Data":"500e79409cde5a03ee1a90888dd57bbb768c2d33834b4d955774baf736ad5eea"} Apr 23 08:15:01.861504 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:15:01.861452 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-kssx7" podStartSLOduration=34.580478249 podStartE2EDuration="39.861439237s" podCreationTimestamp="2026-04-23 08:14:22 +0000 UTC" firstStartedPulling="2026-04-23 08:14:55.591468051 +0000 UTC m=+44.539655258" lastFinishedPulling="2026-04-23 08:15:00.872429037 +0000 UTC m=+49.820616246" observedRunningTime="2026-04-23 08:15:01.860814911 +0000 UTC m=+50.809002140" watchObservedRunningTime="2026-04-23 08:15:01.861439237 +0000 UTC m=+50.809626465" Apr 23 08:15:10.813212 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:15:10.813166 2570 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsvjc" Apr 23 08:15:16.303732 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:15:16.303689 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:15:16.304113 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:15:16.303795 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9zlnz\" (UID: \"ee86940d-8ae8-4a41-80ec-b0743181280c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz" Apr 23 08:15:16.304113 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:16.303859 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:15:16.304113 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:16.303877 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5659b6b8f-zltpw: secret "image-registry-tls" not found Apr 23 08:15:16.304113 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:16.303889 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 08:15:16.304113 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:16.303937 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls podName:b721afd1-8a2d-49cf-9b46-55d368ce0ed8 nodeName:}" failed. 
No retries permitted until 2026-04-23 08:15:48.303920962 +0000 UTC m=+97.252108176 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls") pod "image-registry-5659b6b8f-zltpw" (UID: "b721afd1-8a2d-49cf-9b46-55d368ce0ed8") : secret "image-registry-tls" not found Apr 23 08:15:16.304113 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:16.303952 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert podName:ee86940d-8ae8-4a41-80ec-b0743181280c nodeName:}" failed. No retries permitted until 2026-04-23 08:15:48.303947285 +0000 UTC m=+97.252134491 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9zlnz" (UID: "ee86940d-8ae8-4a41-80ec-b0743181280c") : secret "networking-console-plugin-cert" not found Apr 23 08:15:16.404350 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:15:16.404318 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc" Apr 23 08:15:16.404531 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:15:16.404370 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert\") pod \"ingress-canary-pmw99\" (UID: \"bdabc8c7-71d8-4169-b0d7-8304d3f874d8\") " pod="openshift-ingress-canary/ingress-canary-pmw99" Apr 23 08:15:16.404531 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:16.404464 2570 secret.go:189] Couldn't get 
secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:15:16.404531 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:16.404510 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:15:16.404653 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:16.404550 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls podName:52a0af2f-2394-4095-ac7f-6cc85c59c3a7 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:48.404530871 +0000 UTC m=+97.352718096 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls") pod "dns-default-2lvrc" (UID: "52a0af2f-2394-4095-ac7f-6cc85c59c3a7") : secret "dns-default-metrics-tls" not found Apr 23 08:15:16.404653 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:16.404564 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert podName:bdabc8c7-71d8-4169-b0d7-8304d3f874d8 nodeName:}" failed. No retries permitted until 2026-04-23 08:15:48.404558863 +0000 UTC m=+97.352746073 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert") pod "ingress-canary-pmw99" (UID: "bdabc8c7-71d8-4169-b0d7-8304d3f874d8") : secret "canary-serving-cert" not found Apr 23 08:15:17.312162 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:15:17.312118 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs\") pod \"network-metrics-daemon-4gzjb\" (UID: \"5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f\") " pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:15:17.314408 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:15:17.314387 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:15:17.322972 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:17.322953 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 08:15:17.323044 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:17.323009 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs podName:5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f nodeName:}" failed. No retries permitted until 2026-04-23 08:16:21.322991971 +0000 UTC m=+130.271179177 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs") pod "network-metrics-daemon-4gzjb" (UID: "5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f") : secret "metrics-daemon-secret" not found Apr 23 08:15:24.821629 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:15:24.821601 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-wrt6p" Apr 23 08:15:48.351706 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:15:48.351664 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:15:48.352149 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:15:48.351777 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9zlnz\" (UID: \"ee86940d-8ae8-4a41-80ec-b0743181280c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz" Apr 23 08:15:48.352149 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:48.351823 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 08:15:48.352149 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:48.351839 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5659b6b8f-zltpw: secret "image-registry-tls" not found Apr 23 08:15:48.352149 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:48.351886 2570 secret.go:189] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 08:15:48.352149 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:48.351935 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls podName:b721afd1-8a2d-49cf-9b46-55d368ce0ed8 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:52.351915328 +0000 UTC m=+161.300102549 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls") pod "image-registry-5659b6b8f-zltpw" (UID: "b721afd1-8a2d-49cf-9b46-55d368ce0ed8") : secret "image-registry-tls" not found Apr 23 08:15:48.352149 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:48.351951 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert podName:ee86940d-8ae8-4a41-80ec-b0743181280c nodeName:}" failed. No retries permitted until 2026-04-23 08:16:52.351945028 +0000 UTC m=+161.300132234 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9zlnz" (UID: "ee86940d-8ae8-4a41-80ec-b0743181280c") : secret "networking-console-plugin-cert" not found Apr 23 08:15:48.452258 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:15:48.452216 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc" Apr 23 08:15:48.452258 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:15:48.452267 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert\") pod \"ingress-canary-pmw99\" (UID: \"bdabc8c7-71d8-4169-b0d7-8304d3f874d8\") " pod="openshift-ingress-canary/ingress-canary-pmw99" Apr 23 08:15:48.452438 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:48.452362 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:15:48.452438 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:48.452400 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:15:48.452438 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:48.452433 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls podName:52a0af2f-2394-4095-ac7f-6cc85c59c3a7 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:52.452417695 +0000 UTC m=+161.400604901 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls") pod "dns-default-2lvrc" (UID: "52a0af2f-2394-4095-ac7f-6cc85c59c3a7") : secret "dns-default-metrics-tls" not found Apr 23 08:15:48.452549 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:15:48.452448 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert podName:bdabc8c7-71d8-4169-b0d7-8304d3f874d8 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:52.452441731 +0000 UTC m=+161.400628938 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert") pod "ingress-canary-pmw99" (UID: "bdabc8c7-71d8-4169-b0d7-8304d3f874d8") : secret "canary-serving-cert" not found Apr 23 08:16:21.402574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:21.402535 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs\") pod \"network-metrics-daemon-4gzjb\" (UID: \"5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f\") " pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:16:21.403057 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:16:21.402683 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 08:16:21.403057 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:16:21.402749 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs podName:5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f nodeName:}" failed. No retries permitted until 2026-04-23 08:18:23.402730928 +0000 UTC m=+252.350918156 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs") pod "network-metrics-daemon-4gzjb" (UID: "5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f") : secret "metrics-daemon-secret" not found Apr 23 08:16:32.769362 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:32.769333 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mslvd_9f4d9f29-8efc-4799-b966-20f9d049fd33/dns-node-resolver/0.log" Apr 23 08:16:33.969577 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:33.969549 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fgmqg_749e4605-b37e-4004-a2dc-68092884ddae/node-ca/0.log" Apr 23 08:16:44.401624 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.401588 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-b2s4p"] Apr 23 08:16:44.403778 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.403761 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:44.406425 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.406396 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 08:16:44.406425 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.406414 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-npvv9\"" Apr 23 08:16:44.406596 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.406405 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 08:16:44.407040 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.407024 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 08:16:44.407086 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.407075 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 08:16:44.424995 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.424973 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-b2s4p"] Apr 23 08:16:44.475075 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.475021 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8q82\" (UniqueName: \"kubernetes.io/projected/054d81ed-b5d4-4a4d-b77f-47078e718470-kube-api-access-l8q82\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:44.475309 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.475220 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/054d81ed-b5d4-4a4d-b77f-47078e718470-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:44.475309 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.475258 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/054d81ed-b5d4-4a4d-b77f-47078e718470-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:44.475489 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.475333 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/054d81ed-b5d4-4a4d-b77f-47078e718470-crio-socket\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:44.475489 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.475380 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/054d81ed-b5d4-4a4d-b77f-47078e718470-data-volume\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:44.576013 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.575968 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/054d81ed-b5d4-4a4d-b77f-47078e718470-crio-socket\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " 
pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:44.576184 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.576035 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/054d81ed-b5d4-4a4d-b77f-47078e718470-data-volume\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:44.576184 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.576092 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8q82\" (UniqueName: \"kubernetes.io/projected/054d81ed-b5d4-4a4d-b77f-47078e718470-kube-api-access-l8q82\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:44.576184 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.576100 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/054d81ed-b5d4-4a4d-b77f-47078e718470-crio-socket\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:44.576184 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.576169 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/054d81ed-b5d4-4a4d-b77f-47078e718470-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:44.576346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.576212 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/054d81ed-b5d4-4a4d-b77f-47078e718470-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:44.576346 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:16:44.576311 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 23 08:16:44.576410 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:16:44.576382 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/054d81ed-b5d4-4a4d-b77f-47078e718470-insights-runtime-extractor-tls podName:054d81ed-b5d4-4a4d-b77f-47078e718470 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:45.07636527 +0000 UTC m=+154.024552491 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/054d81ed-b5d4-4a4d-b77f-47078e718470-insights-runtime-extractor-tls") pod "insights-runtime-extractor-b2s4p" (UID: "054d81ed-b5d4-4a4d-b77f-47078e718470") : secret "insights-runtime-extractor-tls" not found Apr 23 08:16:44.576457 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.576408 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/054d81ed-b5d4-4a4d-b77f-47078e718470-data-volume\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:44.576809 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.576789 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/054d81ed-b5d4-4a4d-b77f-47078e718470-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " 
pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:44.586321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:44.586298 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8q82\" (UniqueName: \"kubernetes.io/projected/054d81ed-b5d4-4a4d-b77f-47078e718470-kube-api-access-l8q82\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:45.080239 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:45.080179 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/054d81ed-b5d4-4a4d-b77f-47078e718470-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:45.080413 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:16:45.080335 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 23 08:16:45.080413 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:16:45.080408 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/054d81ed-b5d4-4a4d-b77f-47078e718470-insights-runtime-extractor-tls podName:054d81ed-b5d4-4a4d-b77f-47078e718470 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:46.080390986 +0000 UTC m=+155.028578198 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/054d81ed-b5d4-4a4d-b77f-47078e718470-insights-runtime-extractor-tls") pod "insights-runtime-extractor-b2s4p" (UID: "054d81ed-b5d4-4a4d-b77f-47078e718470") : secret "insights-runtime-extractor-tls" not found Apr 23 08:16:46.088868 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:46.088836 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/054d81ed-b5d4-4a4d-b77f-47078e718470-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:46.089284 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:16:46.088958 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 23 08:16:46.089284 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:16:46.089005 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/054d81ed-b5d4-4a4d-b77f-47078e718470-insights-runtime-extractor-tls podName:054d81ed-b5d4-4a4d-b77f-47078e718470 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:48.08899149 +0000 UTC m=+157.037178697 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/054d81ed-b5d4-4a4d-b77f-47078e718470-insights-runtime-extractor-tls") pod "insights-runtime-extractor-b2s4p" (UID: "054d81ed-b5d4-4a4d-b77f-47078e718470") : secret "insights-runtime-extractor-tls" not found Apr 23 08:16:47.461732 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:16:47.461690 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" podUID="b721afd1-8a2d-49cf-9b46-55d368ce0ed8" Apr 23 08:16:47.474843 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:16:47.474811 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz" podUID="ee86940d-8ae8-4a41-80ec-b0743181280c" Apr 23 08:16:47.542838 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:16:47.542792 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-pmw99" podUID="bdabc8c7-71d8-4169-b0d7-8304d3f874d8" Apr 23 08:16:47.565996 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:16:47.565963 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-2lvrc" podUID="52a0af2f-2394-4095-ac7f-6cc85c59c3a7" Apr 23 08:16:48.095284 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:48.095250 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:16:48.095284 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:48.095280 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pmw99" Apr 23 08:16:48.095475 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:48.095390 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz" Apr 23 08:16:48.105873 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:48.105849 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/054d81ed-b5d4-4a4d-b77f-47078e718470-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:48.105987 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:16:48.105975 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 23 08:16:48.106039 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:16:48.106030 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/054d81ed-b5d4-4a4d-b77f-47078e718470-insights-runtime-extractor-tls podName:054d81ed-b5d4-4a4d-b77f-47078e718470 nodeName:}" failed. No retries permitted until 2026-04-23 08:16:52.106015276 +0000 UTC m=+161.054202484 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/054d81ed-b5d4-4a4d-b77f-47078e718470-insights-runtime-extractor-tls") pod "insights-runtime-extractor-b2s4p" (UID: "054d81ed-b5d4-4a4d-b77f-47078e718470") : secret "insights-runtime-extractor-tls" not found Apr 23 08:16:48.603252 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:16:48.603209 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-4gzjb" podUID="5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f" Apr 23 08:16:52.140603 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:52.140557 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/054d81ed-b5d4-4a4d-b77f-47078e718470-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:52.142833 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:52.142811 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/054d81ed-b5d4-4a4d-b77f-47078e718470-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b2s4p\" (UID: \"054d81ed-b5d4-4a4d-b77f-47078e718470\") " pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:52.213034 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:52.212994 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-b2s4p" Apr 23 08:16:52.338937 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:52.338906 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-b2s4p"] Apr 23 08:16:52.342593 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:16:52.342564 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod054d81ed_b5d4_4a4d_b77f_47078e718470.slice/crio-98b023324706f1614c7b0cc36dbd57c252186b2650bdbb6ef9917a4afa050435 WatchSource:0}: Error finding container 98b023324706f1614c7b0cc36dbd57c252186b2650bdbb6ef9917a4afa050435: Status 404 returned error can't find the container with id 98b023324706f1614c7b0cc36dbd57c252186b2650bdbb6ef9917a4afa050435 Apr 23 08:16:52.443357 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:52.443312 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:16:52.443510 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:52.443421 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9zlnz\" (UID: \"ee86940d-8ae8-4a41-80ec-b0743181280c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz" Apr 23 08:16:52.443557 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:16:52.443525 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 08:16:52.443636 
ip-10-0-135-129 kubenswrapper[2570]: E0423 08:16:52.443624 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert podName:ee86940d-8ae8-4a41-80ec-b0743181280c nodeName:}" failed. No retries permitted until 2026-04-23 08:18:54.443566362 +0000 UTC m=+283.391753573 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-9zlnz" (UID: "ee86940d-8ae8-4a41-80ec-b0743181280c") : secret "networking-console-plugin-cert" not found Apr 23 08:16:52.445741 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:52.445709 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls\") pod \"image-registry-5659b6b8f-zltpw\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") " pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:16:52.544295 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:52.544259 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert\") pod \"ingress-canary-pmw99\" (UID: \"bdabc8c7-71d8-4169-b0d7-8304d3f874d8\") " pod="openshift-ingress-canary/ingress-canary-pmw99" Apr 23 08:16:52.544462 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:52.544356 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc" Apr 23 08:16:52.546643 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:52.546613 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52a0af2f-2394-4095-ac7f-6cc85c59c3a7-metrics-tls\") pod \"dns-default-2lvrc\" (UID: \"52a0af2f-2394-4095-ac7f-6cc85c59c3a7\") " pod="openshift-dns/dns-default-2lvrc" Apr 23 08:16:52.546643 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:52.546627 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bdabc8c7-71d8-4169-b0d7-8304d3f874d8-cert\") pod \"ingress-canary-pmw99\" (UID: \"bdabc8c7-71d8-4169-b0d7-8304d3f874d8\") " pod="openshift-ingress-canary/ingress-canary-pmw99" Apr 23 08:16:52.598474 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:52.598442 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fw7pg\"" Apr 23 08:16:52.598474 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:52.598458 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fkwh7\"" Apr 23 08:16:52.607006 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:52.606985 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:16:52.607133 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:52.607062 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pmw99" Apr 23 08:16:52.726912 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:52.726882 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pmw99"] Apr 23 08:16:52.729595 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:16:52.729562 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdabc8c7_71d8_4169_b0d7_8304d3f874d8.slice/crio-badfa03be2d2f1cfbf708f2732c3943530c6738886b7804119f8e1de7eafd4f8 WatchSource:0}: Error finding container badfa03be2d2f1cfbf708f2732c3943530c6738886b7804119f8e1de7eafd4f8: Status 404 returned error can't find the container with id badfa03be2d2f1cfbf708f2732c3943530c6738886b7804119f8e1de7eafd4f8 Apr 23 08:16:52.745020 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:52.744997 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5659b6b8f-zltpw"] Apr 23 08:16:52.747686 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:16:52.747657 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb721afd1_8a2d_49cf_9b46_55d368ce0ed8.slice/crio-e97e57999faac13e27111a26ea5a27c41006a120430f1cb08733b0da51f96e14 WatchSource:0}: Error finding container e97e57999faac13e27111a26ea5a27c41006a120430f1cb08733b0da51f96e14: Status 404 returned error can't find the container with id e97e57999faac13e27111a26ea5a27c41006a120430f1cb08733b0da51f96e14 Apr 23 08:16:53.108628 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:53.108596 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pmw99" event={"ID":"bdabc8c7-71d8-4169-b0d7-8304d3f874d8","Type":"ContainerStarted","Data":"badfa03be2d2f1cfbf708f2732c3943530c6738886b7804119f8e1de7eafd4f8"} Apr 23 08:16:53.110075 ip-10-0-135-129 kubenswrapper[2570]: I0423 
08:16:53.110044 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b2s4p" event={"ID":"054d81ed-b5d4-4a4d-b77f-47078e718470","Type":"ContainerStarted","Data":"1bc6198121f40f1a7d4e5afb09ddb9fe3b528b06f2c61d03ac2d43570bddb5b1"} Apr 23 08:16:53.110184 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:53.110084 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b2s4p" event={"ID":"054d81ed-b5d4-4a4d-b77f-47078e718470","Type":"ContainerStarted","Data":"98b023324706f1614c7b0cc36dbd57c252186b2650bdbb6ef9917a4afa050435"} Apr 23 08:16:53.111405 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:53.111373 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" event={"ID":"b721afd1-8a2d-49cf-9b46-55d368ce0ed8","Type":"ContainerStarted","Data":"1b1afe721aa24e583c229eea040eac21b3309d1ae82ab0abadd0f27552720acd"} Apr 23 08:16:53.111497 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:53.111413 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" event={"ID":"b721afd1-8a2d-49cf-9b46-55d368ce0ed8","Type":"ContainerStarted","Data":"e97e57999faac13e27111a26ea5a27c41006a120430f1cb08733b0da51f96e14"} Apr 23 08:16:53.111555 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:53.111516 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:16:53.130364 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:53.130184 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" podStartSLOduration=161.130164297 podStartE2EDuration="2m41.130164297s" podCreationTimestamp="2026-04-23 08:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-23 08:16:53.129343737 +0000 UTC m=+162.077530966" watchObservedRunningTime="2026-04-23 08:16:53.130164297 +0000 UTC m=+162.078351527" Apr 23 08:16:53.811325 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:53.811256 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2" podUID="aab3e243-be52-4843-9a50-84da5fac7382" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.10:8000/readyz\": dial tcp 10.132.0.10:8000: connect: connection refused" Apr 23 08:16:54.115362 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:54.115278 2570 generic.go:358] "Generic (PLEG): container finished" podID="aab3e243-be52-4843-9a50-84da5fac7382" containerID="17dae5ea0c4f79226e5af3bfd5ba1ff6c85d02935a3d8eb4c176d23b78fa230e" exitCode=1 Apr 23 08:16:54.115522 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:54.115351 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2" event={"ID":"aab3e243-be52-4843-9a50-84da5fac7382","Type":"ContainerDied","Data":"17dae5ea0c4f79226e5af3bfd5ba1ff6c85d02935a3d8eb4c176d23b78fa230e"} Apr 23 08:16:54.115789 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:54.115765 2570 scope.go:117] "RemoveContainer" containerID="17dae5ea0c4f79226e5af3bfd5ba1ff6c85d02935a3d8eb4c176d23b78fa230e" Apr 23 08:16:54.116770 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:54.116748 2570 generic.go:358] "Generic (PLEG): container finished" podID="76b08de2-6cb8-4320-bedc-cb3ca96031f5" containerID="bd83c3fbc3ecb6d7152e300831fca6b6be7b9bcb8245a2e98d014da699363d50" exitCode=255 Apr 23 08:16:54.116854 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:54.116813 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57cffdbbd5-5knwz" 
event={"ID":"76b08de2-6cb8-4320-bedc-cb3ca96031f5","Type":"ContainerDied","Data":"bd83c3fbc3ecb6d7152e300831fca6b6be7b9bcb8245a2e98d014da699363d50"} Apr 23 08:16:54.117227 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:54.117086 2570 scope.go:117] "RemoveContainer" containerID="bd83c3fbc3ecb6d7152e300831fca6b6be7b9bcb8245a2e98d014da699363d50" Apr 23 08:16:54.118651 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:54.118617 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b2s4p" event={"ID":"054d81ed-b5d4-4a4d-b77f-47078e718470","Type":"ContainerStarted","Data":"aadcf0bf0d32fea1389099c1b3e638df4a0fc01b4c500ef5d51a6311655605f8"} Apr 23 08:16:54.808622 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:54.808580 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57cffdbbd5-5knwz" Apr 23 08:16:54.836246 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:54.836212 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2" Apr 23 08:16:55.124181 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:55.124098 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57cffdbbd5-5knwz" event={"ID":"76b08de2-6cb8-4320-bedc-cb3ca96031f5","Type":"ContainerStarted","Data":"482b4ab251870fa67dd9642818162cd510088588680e8a69a14974098b4d7fa9"} Apr 23 08:16:55.125611 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:55.125579 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2" event={"ID":"aab3e243-be52-4843-9a50-84da5fac7382","Type":"ContainerStarted","Data":"c50938891980961b9af043625a40f330b1c70fcfd3a64494ce8f117b3af37f99"} Apr 23 08:16:55.125786 ip-10-0-135-129 
kubenswrapper[2570]: I0423 08:16:55.125768 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2" Apr 23 08:16:55.126454 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:55.126436 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-85d9b8f75b-pzlv2" Apr 23 08:16:56.128887 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:56.128847 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pmw99" event={"ID":"bdabc8c7-71d8-4169-b0d7-8304d3f874d8","Type":"ContainerStarted","Data":"5819d2083a611a786d42e744ea9241caed404d1dddbb5a3844061b46f0a0125c"} Apr 23 08:16:56.130646 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:56.130622 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b2s4p" event={"ID":"054d81ed-b5d4-4a4d-b77f-47078e718470","Type":"ContainerStarted","Data":"5052010e6806dacdef2fd629eeac2d28fb02c044c3361a6fecbad5a006cef9fd"} Apr 23 08:16:56.143091 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:56.143047 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pmw99" podStartSLOduration=129.717971536 podStartE2EDuration="2m12.143034338s" podCreationTimestamp="2026-04-23 08:14:44 +0000 UTC" firstStartedPulling="2026-04-23 08:16:52.731402616 +0000 UTC m=+161.679589823" lastFinishedPulling="2026-04-23 08:16:55.156465416 +0000 UTC m=+164.104652625" observedRunningTime="2026-04-23 08:16:56.14254099 +0000 UTC m=+165.090728258" watchObservedRunningTime="2026-04-23 08:16:56.143034338 +0000 UTC m=+165.091221559" Apr 23 08:16:56.158145 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:56.158107 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-b2s4p" 
podStartSLOduration=9.397002059 podStartE2EDuration="12.158092329s" podCreationTimestamp="2026-04-23 08:16:44 +0000 UTC" firstStartedPulling="2026-04-23 08:16:52.396223332 +0000 UTC m=+161.344410539" lastFinishedPulling="2026-04-23 08:16:55.157313599 +0000 UTC m=+164.105500809" observedRunningTime="2026-04-23 08:16:56.157396435 +0000 UTC m=+165.105583657" watchObservedRunningTime="2026-04-23 08:16:56.158092329 +0000 UTC m=+165.106279560" Apr 23 08:16:58.585718 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:58.585672 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2lvrc" Apr 23 08:16:58.587977 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:58.587957 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f2cmm\"" Apr 23 08:16:58.596584 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:58.596558 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2lvrc" Apr 23 08:16:58.707594 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:58.707565 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2lvrc"] Apr 23 08:16:58.710734 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:16:58.710708 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52a0af2f_2394_4095_ac7f_6cc85c59c3a7.slice/crio-75385c77a511b6756dfdf19cae3b9f6f43bd041ce74e92738ae2c630a892ce38 WatchSource:0}: Error finding container 75385c77a511b6756dfdf19cae3b9f6f43bd041ce74e92738ae2c630a892ce38: Status 404 returned error can't find the container with id 75385c77a511b6756dfdf19cae3b9f6f43bd041ce74e92738ae2c630a892ce38 Apr 23 08:16:59.138843 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:16:59.138799 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2lvrc" 
event={"ID":"52a0af2f-2394-4095-ac7f-6cc85c59c3a7","Type":"ContainerStarted","Data":"75385c77a511b6756dfdf19cae3b9f6f43bd041ce74e92738ae2c630a892ce38"} Apr 23 08:17:01.145640 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:01.145601 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2lvrc" event={"ID":"52a0af2f-2394-4095-ac7f-6cc85c59c3a7","Type":"ContainerStarted","Data":"f5714ca7f024327d8e64c6071f43be937fa04f1d1120f7b660d1c9c549d0898b"} Apr 23 08:17:01.145640 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:01.145638 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2lvrc" event={"ID":"52a0af2f-2394-4095-ac7f-6cc85c59c3a7","Type":"ContainerStarted","Data":"4ec77ff51f3835de7376a32e7e72d54ddb8d3cc3759f8c455c41780e59dd254e"} Apr 23 08:17:01.146300 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:01.145668 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-2lvrc" Apr 23 08:17:01.162840 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:01.162792 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2lvrc" podStartSLOduration=135.602861359 podStartE2EDuration="2m17.162778664s" podCreationTimestamp="2026-04-23 08:14:44 +0000 UTC" firstStartedPulling="2026-04-23 08:16:58.712458517 +0000 UTC m=+167.660645724" lastFinishedPulling="2026-04-23 08:17:00.272375808 +0000 UTC m=+169.220563029" observedRunningTime="2026-04-23 08:17:01.162358 +0000 UTC m=+170.110545228" watchObservedRunningTime="2026-04-23 08:17:01.162778664 +0000 UTC m=+170.110965894" Apr 23 08:17:01.587992 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:01.587960 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb" Apr 23 08:17:11.150569 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:11.150539 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2lvrc" Apr 23 08:17:12.611718 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:12.611683 2570 patch_prober.go:28] interesting pod/image-registry-5659b6b8f-zltpw container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 08:17:12.612080 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:12.611746 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" podUID="b721afd1-8a2d-49cf-9b46-55d368ce0ed8" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 08:17:14.115264 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.115226 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-5c259"] Apr 23 08:17:14.118274 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.118253 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-5c259" Apr 23 08:17:14.120132 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.120109 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 08:17:14.120434 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.120415 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 08:17:14.120520 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.120435 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 08:17:14.120574 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.120437 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-jn6vp\"" Apr 23 08:17:14.120632 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.120578 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 08:17:14.120850 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.120832 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 08:17:14.120909 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.120878 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 08:17:14.123002 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.122981 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" Apr 23 08:17:14.209307 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.209268 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-node-exporter-accelerators-collector-config\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.209459 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.209323 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-metrics-client-ca\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.209459 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.209341 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.209459 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.209381 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-sys\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.209459 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.209409 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-node-exporter-tls\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.209459 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.209427 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-root\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.209752 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.209481 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-node-exporter-textfile\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.209752 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.209517 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-node-exporter-wtmp\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.209752 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.209642 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qss5\" (UniqueName: \"kubernetes.io/projected/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-kube-api-access-9qss5\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.310248 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.310204 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qss5\" (UniqueName: \"kubernetes.io/projected/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-kube-api-access-9qss5\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.310450 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.310258 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-node-exporter-accelerators-collector-config\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.310450 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.310282 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-metrics-client-ca\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.310450 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.310297 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.310450 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.310321 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-sys\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.310450 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.310339 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-node-exporter-tls\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.310450 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.310363 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-root\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.310450 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.310389 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-node-exporter-textfile\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.310450 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.310413 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-node-exporter-wtmp\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.310450 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.310429 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-sys\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.310450 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.310447 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-root\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.310801 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.310621 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-node-exporter-wtmp\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.310801 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.310780 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-node-exporter-textfile\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.311016 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.310995 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-node-exporter-accelerators-collector-config\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.311087 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.311017 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-metrics-client-ca\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.312960 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.312934 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.313053 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.312934 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-node-exporter-tls\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.317366 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.317348 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qss5\" (UniqueName: \"kubernetes.io/projected/91edfee8-0edd-47a6-a65f-1cb810bbe9ed-kube-api-access-9qss5\") pod \"node-exporter-5c259\" (UID: \"91edfee8-0edd-47a6-a65f-1cb810bbe9ed\") " pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.427764 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:14.427673 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-5c259"
Apr 23 08:17:14.437667 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:17:14.437633 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91edfee8_0edd_47a6_a65f_1cb810bbe9ed.slice/crio-ff507bc84ce59a5aaf8b5539d2cc79c1163e18df6a99d9ae6ab81a1d7c9aee76 WatchSource:0}: Error finding container ff507bc84ce59a5aaf8b5539d2cc79c1163e18df6a99d9ae6ab81a1d7c9aee76: Status 404 returned error can't find the container with id ff507bc84ce59a5aaf8b5539d2cc79c1163e18df6a99d9ae6ab81a1d7c9aee76
Apr 23 08:17:15.183424 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:15.183392 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5c259" event={"ID":"91edfee8-0edd-47a6-a65f-1cb810bbe9ed","Type":"ContainerStarted","Data":"30e66edc047657410740c86bde579d95d61c2bd968e2bb4c4871c837a21f5721"}
Apr 23 08:17:15.183860 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:15.183432 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5c259" event={"ID":"91edfee8-0edd-47a6-a65f-1cb810bbe9ed","Type":"ContainerStarted","Data":"ff507bc84ce59a5aaf8b5539d2cc79c1163e18df6a99d9ae6ab81a1d7c9aee76"}
Apr 23 08:17:16.187421 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:16.187392 2570 generic.go:358] "Generic (PLEG): container finished" podID="91edfee8-0edd-47a6-a65f-1cb810bbe9ed" containerID="30e66edc047657410740c86bde579d95d61c2bd968e2bb4c4871c837a21f5721" exitCode=0
Apr 23 08:17:16.187787 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:16.187428 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5c259" event={"ID":"91edfee8-0edd-47a6-a65f-1cb810bbe9ed","Type":"ContainerDied","Data":"30e66edc047657410740c86bde579d95d61c2bd968e2bb4c4871c837a21f5721"}
Apr 23 08:17:17.192174 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:17.192134 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5c259" event={"ID":"91edfee8-0edd-47a6-a65f-1cb810bbe9ed","Type":"ContainerStarted","Data":"1954271cf86bee4bfc9f90119c6498097cfafb8af0f86f9aa7b1bffc593047a9"}
Apr 23 08:17:17.192174 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:17.192175 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5c259" event={"ID":"91edfee8-0edd-47a6-a65f-1cb810bbe9ed","Type":"ContainerStarted","Data":"b81376ddf14085bae45a4f5fcfdc1894c80b1c13e294d438d00725ac08d7b4fc"}
Apr 23 08:17:17.210249 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:17.210177 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-5c259" podStartSLOduration=2.555558145 podStartE2EDuration="3.210159898s" podCreationTimestamp="2026-04-23 08:17:14 +0000 UTC" firstStartedPulling="2026-04-23 08:17:14.439884835 +0000 UTC m=+183.388072047" lastFinishedPulling="2026-04-23 08:17:15.094486593 +0000 UTC m=+184.042673800" observedRunningTime="2026-04-23 08:17:17.210091511 +0000 UTC m=+186.158278744" watchObservedRunningTime="2026-04-23 08:17:17.210159898 +0000 UTC m=+186.158347133"
Apr 23 08:17:24.818452 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:24.818412 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" podUID="f8e3f488-9aff-4f7a-a300-a03290cbf7ce" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 23 08:17:27.276335 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:27.276303 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5659b6b8f-zltpw"]
Apr 23 08:17:34.819099 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:34.819056 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" podUID="f8e3f488-9aff-4f7a-a300-a03290cbf7ce" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 23 08:17:44.818597 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:44.818558 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" podUID="f8e3f488-9aff-4f7a-a300-a03290cbf7ce" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 23 08:17:44.818968 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:44.818629 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676"
Apr 23 08:17:44.819106 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:44.819074 2570 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"a01146a026c89aa9ad5572fc87cc31c10061ff4c40dbda35b432205ecaeab247"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 23 08:17:44.819183 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:44.819126 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" podUID="f8e3f488-9aff-4f7a-a300-a03290cbf7ce" containerName="service-proxy" containerID="cri-o://a01146a026c89aa9ad5572fc87cc31c10061ff4c40dbda35b432205ecaeab247" gracePeriod=30
Apr 23 08:17:45.267340 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:45.267308 2570 generic.go:358] "Generic (PLEG): container finished" podID="f8e3f488-9aff-4f7a-a300-a03290cbf7ce" containerID="a01146a026c89aa9ad5572fc87cc31c10061ff4c40dbda35b432205ecaeab247" exitCode=2
Apr 23 08:17:45.267509 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:45.267383 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" event={"ID":"f8e3f488-9aff-4f7a-a300-a03290cbf7ce","Type":"ContainerDied","Data":"a01146a026c89aa9ad5572fc87cc31c10061ff4c40dbda35b432205ecaeab247"}
Apr 23 08:17:45.267509 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:45.267423 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b8989dcd8-6k676" event={"ID":"f8e3f488-9aff-4f7a-a300-a03290cbf7ce","Type":"ContainerStarted","Data":"24fce65a16265c3d2e7b9c097f2a0483b20be9d5d619c3629d3782d3799c7e5a"}
Apr 23 08:17:52.299759 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.299693 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" podUID="b721afd1-8a2d-49cf-9b46-55d368ce0ed8" containerName="registry" containerID="cri-o://1b1afe721aa24e583c229eea040eac21b3309d1ae82ab0abadd0f27552720acd" gracePeriod=30
Apr 23 08:17:52.547033 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.547010 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5659b6b8f-zltpw"
Apr 23 08:17:52.728346 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.728314 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls\") pod \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") "
Apr 23 08:17:52.728515 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.728373 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c299t\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-kube-api-access-c299t\") pod \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") "
Apr 23 08:17:52.728515 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.728397 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-image-registry-private-configuration\") pod \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") "
Apr 23 08:17:52.728515 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.728432 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-installation-pull-secrets\") pod \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") "
Apr 23 08:17:52.728515 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.728453 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-ca-trust-extracted\") pod \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") "
Apr 23 08:17:52.728515 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.728477 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-trusted-ca\") pod \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") "
Apr 23 08:17:52.728767 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.728516 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-certificates\") pod \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") "
Apr 23 08:17:52.728767 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.728553 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-bound-sa-token\") pod \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\" (UID: \"b721afd1-8a2d-49cf-9b46-55d368ce0ed8\") "
Apr 23 08:17:52.728983 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.728950 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b721afd1-8a2d-49cf-9b46-55d368ce0ed8" (UID: "b721afd1-8a2d-49cf-9b46-55d368ce0ed8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:17:52.729155 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.729112 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b721afd1-8a2d-49cf-9b46-55d368ce0ed8" (UID: "b721afd1-8a2d-49cf-9b46-55d368ce0ed8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:17:52.730997 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.730960 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b721afd1-8a2d-49cf-9b46-55d368ce0ed8" (UID: "b721afd1-8a2d-49cf-9b46-55d368ce0ed8"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:17:52.731177 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.731156 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b721afd1-8a2d-49cf-9b46-55d368ce0ed8" (UID: "b721afd1-8a2d-49cf-9b46-55d368ce0ed8"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:17:52.731313 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.731289 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b721afd1-8a2d-49cf-9b46-55d368ce0ed8" (UID: "b721afd1-8a2d-49cf-9b46-55d368ce0ed8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:17:52.731382 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.731364 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-kube-api-access-c299t" (OuterVolumeSpecName: "kube-api-access-c299t") pod "b721afd1-8a2d-49cf-9b46-55d368ce0ed8" (UID: "b721afd1-8a2d-49cf-9b46-55d368ce0ed8"). InnerVolumeSpecName "kube-api-access-c299t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:17:52.731438 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.731392 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b721afd1-8a2d-49cf-9b46-55d368ce0ed8" (UID: "b721afd1-8a2d-49cf-9b46-55d368ce0ed8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:17:52.737317 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.737288 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b721afd1-8a2d-49cf-9b46-55d368ce0ed8" (UID: "b721afd1-8a2d-49cf-9b46-55d368ce0ed8"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 08:17:52.829448 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.829406 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-certificates\") on node \"ip-10-0-135-129.ec2.internal\" DevicePath \"\""
Apr 23 08:17:52.829448 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.829442 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-bound-sa-token\") on node \"ip-10-0-135-129.ec2.internal\" DevicePath \"\""
Apr 23 08:17:52.829448 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.829455 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-registry-tls\") on node \"ip-10-0-135-129.ec2.internal\" DevicePath \"\""
Apr 23 08:17:52.829676 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.829466 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c299t\" (UniqueName: \"kubernetes.io/projected/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-kube-api-access-c299t\") on node \"ip-10-0-135-129.ec2.internal\" DevicePath \"\""
Apr 23 08:17:52.829676 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.829480 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-image-registry-private-configuration\") on node \"ip-10-0-135-129.ec2.internal\" DevicePath \"\""
Apr 23 08:17:52.829676 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.829494 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-installation-pull-secrets\") on node \"ip-10-0-135-129.ec2.internal\" DevicePath \"\""
Apr 23 08:17:52.829676 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.829507 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-ca-trust-extracted\") on node \"ip-10-0-135-129.ec2.internal\" DevicePath \"\""
Apr 23 08:17:52.829676 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:52.829518 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b721afd1-8a2d-49cf-9b46-55d368ce0ed8-trusted-ca\") on node \"ip-10-0-135-129.ec2.internal\" DevicePath \"\""
Apr 23 08:17:53.291128 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:53.291090 2570 generic.go:358] "Generic (PLEG): container finished" podID="b721afd1-8a2d-49cf-9b46-55d368ce0ed8" containerID="1b1afe721aa24e583c229eea040eac21b3309d1ae82ab0abadd0f27552720acd" exitCode=0
Apr 23 08:17:53.291316 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:53.291155 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5659b6b8f-zltpw"
Apr 23 08:17:53.291316 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:53.291154 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" event={"ID":"b721afd1-8a2d-49cf-9b46-55d368ce0ed8","Type":"ContainerDied","Data":"1b1afe721aa24e583c229eea040eac21b3309d1ae82ab0abadd0f27552720acd"}
Apr 23 08:17:53.291316 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:53.291258 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5659b6b8f-zltpw" event={"ID":"b721afd1-8a2d-49cf-9b46-55d368ce0ed8","Type":"ContainerDied","Data":"e97e57999faac13e27111a26ea5a27c41006a120430f1cb08733b0da51f96e14"}
Apr 23 08:17:53.291316 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:53.291281 2570 scope.go:117] "RemoveContainer" containerID="1b1afe721aa24e583c229eea040eac21b3309d1ae82ab0abadd0f27552720acd"
Apr 23 08:17:53.299720 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:53.299702 2570 scope.go:117] "RemoveContainer" containerID="1b1afe721aa24e583c229eea040eac21b3309d1ae82ab0abadd0f27552720acd"
Apr 23 08:17:53.299979 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:17:53.299960 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b1afe721aa24e583c229eea040eac21b3309d1ae82ab0abadd0f27552720acd\": container with ID starting with 1b1afe721aa24e583c229eea040eac21b3309d1ae82ab0abadd0f27552720acd not found: ID does not exist" containerID="1b1afe721aa24e583c229eea040eac21b3309d1ae82ab0abadd0f27552720acd"
Apr 23 08:17:53.300227 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:53.299988 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b1afe721aa24e583c229eea040eac21b3309d1ae82ab0abadd0f27552720acd"} err="failed to get container status \"1b1afe721aa24e583c229eea040eac21b3309d1ae82ab0abadd0f27552720acd\": rpc error: code = NotFound desc = could not find container \"1b1afe721aa24e583c229eea040eac21b3309d1ae82ab0abadd0f27552720acd\": container with ID starting with 1b1afe721aa24e583c229eea040eac21b3309d1ae82ab0abadd0f27552720acd not found: ID does not exist"
Apr 23 08:17:53.309691 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:53.309670 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5659b6b8f-zltpw"]
Apr 23 08:17:53.315628 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:53.315609 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5659b6b8f-zltpw"]
Apr 23 08:17:53.589833 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:17:53.589740 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b721afd1-8a2d-49cf-9b46-55d368ce0ed8" path="/var/lib/kubelet/pods/b721afd1-8a2d-49cf-9b46-55d368ce0ed8/volumes"
Apr 23 08:18:23.447375 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:18:23.447334 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs\") pod \"network-metrics-daemon-4gzjb\" (UID: \"5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f\") " pod="openshift-multus/network-metrics-daemon-4gzjb"
Apr 23 08:18:23.449647 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:18:23.449622 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f-metrics-certs\") pod \"network-metrics-daemon-4gzjb\" (UID: \"5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f\") " pod="openshift-multus/network-metrics-daemon-4gzjb"
Apr 23 08:18:23.491224 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:18:23.491170 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vjj4z\""
Apr 23 08:18:23.499801 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:18:23.499773 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4gzjb"
Apr 23 08:18:23.614361 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:18:23.614331 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4gzjb"]
Apr 23 08:18:23.617866 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:18:23.617828 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e2ddc62_96a8_4827_87cc_a0b37c9a5e9f.slice/crio-ba2e4a8cd51d753c9b8a4de63c9f9b824c6e84728ca561107011036a6c88938c WatchSource:0}: Error finding container ba2e4a8cd51d753c9b8a4de63c9f9b824c6e84728ca561107011036a6c88938c: Status 404 returned error can't find the container with id ba2e4a8cd51d753c9b8a4de63c9f9b824c6e84728ca561107011036a6c88938c
Apr 23 08:18:24.373604 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:18:24.373572 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4gzjb" event={"ID":"5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f","Type":"ContainerStarted","Data":"ba2e4a8cd51d753c9b8a4de63c9f9b824c6e84728ca561107011036a6c88938c"}
Apr 23 08:18:25.379971 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:18:25.379928 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4gzjb" event={"ID":"5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f","Type":"ContainerStarted","Data":"25f62d5ed0b5b69fff5e81b652b706b4958f49ecc0f11e832f797d18087dbd6c"}
Apr 23 08:18:25.379971 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:18:25.379975 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4gzjb" event={"ID":"5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f","Type":"ContainerStarted","Data":"6d957b9b110e138b2ee7b39f23a5934e7943333ebe54c7a36463db5a55ee8807"}
Apr 23 08:18:25.394877 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:18:25.394828 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4gzjb" podStartSLOduration=253.387605669 podStartE2EDuration="4m14.394814973s" podCreationTimestamp="2026-04-23 08:14:11 +0000 UTC" firstStartedPulling="2026-04-23 08:18:23.619701792 +0000 UTC m=+252.567888999" lastFinishedPulling="2026-04-23 08:18:24.626911093 +0000 UTC m=+253.575098303" observedRunningTime="2026-04-23 08:18:25.394038301 +0000 UTC m=+254.342225543" watchObservedRunningTime="2026-04-23 08:18:25.394814973 +0000 UTC m=+254.343002201"
Apr 23 08:18:51.096335 ip-10-0-135-129 kubenswrapper[2570]: E0423 08:18:51.096295 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz" podUID="ee86940d-8ae8-4a41-80ec-b0743181280c"
Apr 23 08:18:51.446488 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:18:51.446406 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz"
Apr 23 08:18:54.476544 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:18:54.476507 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9zlnz\" (UID: \"ee86940d-8ae8-4a41-80ec-b0743181280c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz"
Apr 23 08:18:54.478998 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:18:54.478970 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ee86940d-8ae8-4a41-80ec-b0743181280c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-9zlnz\" (UID: \"ee86940d-8ae8-4a41-80ec-b0743181280c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz"
Apr 23 08:18:54.749380 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:18:54.749287 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-gvqnd\""
Apr 23 08:18:54.757856 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:18:54.757834 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz"
Apr 23 08:18:54.877216 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:18:54.877173 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz"]
Apr 23 08:18:54.880392 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:18:54.880362 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee86940d_8ae8_4a41_80ec_b0743181280c.slice/crio-1a4925b05bd7af9128014717d018ec6edd7a3fd6773992b769221dee382f79dc WatchSource:0}: Error finding container 1a4925b05bd7af9128014717d018ec6edd7a3fd6773992b769221dee382f79dc: Status 404 returned error can't find the container with id 1a4925b05bd7af9128014717d018ec6edd7a3fd6773992b769221dee382f79dc
Apr 23 08:18:55.458550 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:18:55.458514 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz" event={"ID":"ee86940d-8ae8-4a41-80ec-b0743181280c","Type":"ContainerStarted","Data":"1a4925b05bd7af9128014717d018ec6edd7a3fd6773992b769221dee382f79dc"}
Apr 23 08:18:56.462947 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:18:56.462919 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz" event={"ID":"ee86940d-8ae8-4a41-80ec-b0743181280c","Type":"ContainerStarted","Data":"53f51443730a75644af30fb10ddc4d3eb29947aafe758e335f7b1c56e87f74f4"}
Apr 23 08:18:56.478405 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:18:56.478357 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-9zlnz" podStartSLOduration=268.982659498 podStartE2EDuration="4m30.478341977s" podCreationTimestamp="2026-04-23 08:14:26 +0000 UTC" firstStartedPulling="2026-04-23 08:18:54.882384211
+0000 UTC m=+283.830571417" lastFinishedPulling="2026-04-23 08:18:56.378066674 +0000 UTC m=+285.326253896" observedRunningTime="2026-04-23 08:18:56.477376847 +0000 UTC m=+285.425564088" watchObservedRunningTime="2026-04-23 08:18:56.478341977 +0000 UTC m=+285.426529205" Apr 23 08:19:11.506297 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:19:11.506266 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:19:11.506727 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:19:11.506384 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:19:11.509169 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:19:11.509146 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 08:20:26.386873 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:26.386790 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-hsq9z"] Apr 23 08:20:26.387363 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:26.387063 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b721afd1-8a2d-49cf-9b46-55d368ce0ed8" containerName="registry" Apr 23 08:20:26.387363 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:26.387076 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="b721afd1-8a2d-49cf-9b46-55d368ce0ed8" containerName="registry" Apr 23 08:20:26.387363 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:26.387121 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="b721afd1-8a2d-49cf-9b46-55d368ce0ed8" containerName="registry" Apr 23 08:20:26.389806 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:26.389789 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-hsq9z" Apr 23 08:20:26.391624 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:26.391603 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 23 08:20:26.391737 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:26.391690 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-f4qld\"" Apr 23 08:20:26.392247 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:26.392225 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 23 08:20:26.399121 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:26.399101 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-hsq9z"] Apr 23 08:20:26.495556 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:26.495513 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c807e76-5e24-4194-97d3-344b1f256c63-bound-sa-token\") pod \"cert-manager-79c8d999ff-hsq9z\" (UID: \"2c807e76-5e24-4194-97d3-344b1f256c63\") " pod="cert-manager/cert-manager-79c8d999ff-hsq9z" Apr 23 08:20:26.495556 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:26.495553 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7x2r\" (UniqueName: \"kubernetes.io/projected/2c807e76-5e24-4194-97d3-344b1f256c63-kube-api-access-f7x2r\") pod \"cert-manager-79c8d999ff-hsq9z\" (UID: \"2c807e76-5e24-4194-97d3-344b1f256c63\") " pod="cert-manager/cert-manager-79c8d999ff-hsq9z" Apr 23 08:20:26.596901 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:26.596861 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/2c807e76-5e24-4194-97d3-344b1f256c63-bound-sa-token\") pod \"cert-manager-79c8d999ff-hsq9z\" (UID: \"2c807e76-5e24-4194-97d3-344b1f256c63\") " pod="cert-manager/cert-manager-79c8d999ff-hsq9z" Apr 23 08:20:26.596901 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:26.596902 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7x2r\" (UniqueName: \"kubernetes.io/projected/2c807e76-5e24-4194-97d3-344b1f256c63-kube-api-access-f7x2r\") pod \"cert-manager-79c8d999ff-hsq9z\" (UID: \"2c807e76-5e24-4194-97d3-344b1f256c63\") " pod="cert-manager/cert-manager-79c8d999ff-hsq9z" Apr 23 08:20:26.606600 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:26.606572 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7x2r\" (UniqueName: \"kubernetes.io/projected/2c807e76-5e24-4194-97d3-344b1f256c63-kube-api-access-f7x2r\") pod \"cert-manager-79c8d999ff-hsq9z\" (UID: \"2c807e76-5e24-4194-97d3-344b1f256c63\") " pod="cert-manager/cert-manager-79c8d999ff-hsq9z" Apr 23 08:20:26.608397 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:26.608376 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c807e76-5e24-4194-97d3-344b1f256c63-bound-sa-token\") pod \"cert-manager-79c8d999ff-hsq9z\" (UID: \"2c807e76-5e24-4194-97d3-344b1f256c63\") " pod="cert-manager/cert-manager-79c8d999ff-hsq9z" Apr 23 08:20:26.698709 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:26.698628 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-hsq9z" Apr 23 08:20:26.809373 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:26.809344 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-hsq9z"] Apr 23 08:20:26.812614 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:20:26.812587 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c807e76_5e24_4194_97d3_344b1f256c63.slice/crio-d6c3019e739cfe23d96110aa1b27328a452f4af061ff886e13b864556af84c43 WatchSource:0}: Error finding container d6c3019e739cfe23d96110aa1b27328a452f4af061ff886e13b864556af84c43: Status 404 returned error can't find the container with id d6c3019e739cfe23d96110aa1b27328a452f4af061ff886e13b864556af84c43 Apr 23 08:20:26.814387 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:26.814366 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 08:20:27.695799 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:27.695743 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-hsq9z" event={"ID":"2c807e76-5e24-4194-97d3-344b1f256c63","Type":"ContainerStarted","Data":"d6c3019e739cfe23d96110aa1b27328a452f4af061ff886e13b864556af84c43"} Apr 23 08:20:29.702589 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:29.702550 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-hsq9z" event={"ID":"2c807e76-5e24-4194-97d3-344b1f256c63","Type":"ContainerStarted","Data":"efe5c24bbe7d5cb49d8c728ad3893f61f4c248aec4bac6985b805aeebd49f4c3"} Apr 23 08:20:29.718038 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:29.717984 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-hsq9z" podStartSLOduration=1.01043233 podStartE2EDuration="3.717969254s" podCreationTimestamp="2026-04-23 08:20:26 +0000 
UTC" firstStartedPulling="2026-04-23 08:20:26.814490148 +0000 UTC m=+375.762677356" lastFinishedPulling="2026-04-23 08:20:29.522027073 +0000 UTC m=+378.470214280" observedRunningTime="2026-04-23 08:20:29.717893631 +0000 UTC m=+378.666080860" watchObservedRunningTime="2026-04-23 08:20:29.717969254 +0000 UTC m=+378.666156484" Apr 23 08:20:46.613669 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:46.613636 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-g6ptd"] Apr 23 08:20:46.616694 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:46.616678 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-g6ptd" Apr 23 08:20:46.618766 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:46.618748 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:20:46.619374 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:46.619355 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 23 08:20:46.619468 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:46.619361 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-p6nhj\"" Apr 23 08:20:46.626102 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:46.626076 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-g6ptd"] Apr 23 08:20:46.736213 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:46.736171 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a8fe58b-a624-41ac-9cf4-b9dba545dd91-tmp\") pod \"jobset-operator-747c5859c7-g6ptd\" (UID: \"2a8fe58b-a624-41ac-9cf4-b9dba545dd91\") " 
pod="openshift-jobset-operator/jobset-operator-747c5859c7-g6ptd" Apr 23 08:20:46.736406 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:46.736305 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr8kr\" (UniqueName: \"kubernetes.io/projected/2a8fe58b-a624-41ac-9cf4-b9dba545dd91-kube-api-access-gr8kr\") pod \"jobset-operator-747c5859c7-g6ptd\" (UID: \"2a8fe58b-a624-41ac-9cf4-b9dba545dd91\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-g6ptd" Apr 23 08:20:46.836909 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:46.836868 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a8fe58b-a624-41ac-9cf4-b9dba545dd91-tmp\") pod \"jobset-operator-747c5859c7-g6ptd\" (UID: \"2a8fe58b-a624-41ac-9cf4-b9dba545dd91\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-g6ptd" Apr 23 08:20:46.837064 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:46.836934 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gr8kr\" (UniqueName: \"kubernetes.io/projected/2a8fe58b-a624-41ac-9cf4-b9dba545dd91-kube-api-access-gr8kr\") pod \"jobset-operator-747c5859c7-g6ptd\" (UID: \"2a8fe58b-a624-41ac-9cf4-b9dba545dd91\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-g6ptd" Apr 23 08:20:46.837321 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:46.837300 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a8fe58b-a624-41ac-9cf4-b9dba545dd91-tmp\") pod \"jobset-operator-747c5859c7-g6ptd\" (UID: \"2a8fe58b-a624-41ac-9cf4-b9dba545dd91\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-g6ptd" Apr 23 08:20:46.845336 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:46.845310 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr8kr\" (UniqueName: 
\"kubernetes.io/projected/2a8fe58b-a624-41ac-9cf4-b9dba545dd91-kube-api-access-gr8kr\") pod \"jobset-operator-747c5859c7-g6ptd\" (UID: \"2a8fe58b-a624-41ac-9cf4-b9dba545dd91\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-g6ptd" Apr 23 08:20:46.925492 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:46.925405 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-g6ptd" Apr 23 08:20:47.037921 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:47.037875 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-g6ptd"] Apr 23 08:20:47.041717 ip-10-0-135-129 kubenswrapper[2570]: W0423 08:20:47.041688 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a8fe58b_a624_41ac_9cf4_b9dba545dd91.slice/crio-ac6b2b99a916fba0d9978f8e6e9be05bd5c5236ed6fe1c4c9fd7b4563d1126aa WatchSource:0}: Error finding container ac6b2b99a916fba0d9978f8e6e9be05bd5c5236ed6fe1c4c9fd7b4563d1126aa: Status 404 returned error can't find the container with id ac6b2b99a916fba0d9978f8e6e9be05bd5c5236ed6fe1c4c9fd7b4563d1126aa Apr 23 08:20:47.752303 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:47.752267 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-g6ptd" event={"ID":"2a8fe58b-a624-41ac-9cf4-b9dba545dd91","Type":"ContainerStarted","Data":"ac6b2b99a916fba0d9978f8e6e9be05bd5c5236ed6fe1c4c9fd7b4563d1126aa"} Apr 23 08:20:49.759856 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:20:49.759823 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-g6ptd" event={"ID":"2a8fe58b-a624-41ac-9cf4-b9dba545dd91","Type":"ContainerStarted","Data":"ed556e227d702b57da4131d12e1e63993b7e3ce2c74664684e109a7e9bc44379"} Apr 23 08:20:49.778267 ip-10-0-135-129 kubenswrapper[2570]: 
I0423 08:20:49.778210 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-operator-747c5859c7-g6ptd" podStartSLOduration=1.138793425 podStartE2EDuration="3.778177544s" podCreationTimestamp="2026-04-23 08:20:46 +0000 UTC" firstStartedPulling="2026-04-23 08:20:47.043567603 +0000 UTC m=+395.991754809" lastFinishedPulling="2026-04-23 08:20:49.68295171 +0000 UTC m=+398.631138928" observedRunningTime="2026-04-23 08:20:49.776408811 +0000 UTC m=+398.724596039" watchObservedRunningTime="2026-04-23 08:20:49.778177544 +0000 UTC m=+398.726364772" Apr 23 08:24:11.526114 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:24:11.526086 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:24:11.526952 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:24:11.526929 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:29:11.541896 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:29:11.541810 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:29:11.543831 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:29:11.543811 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:34:11.562484 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:34:11.562455 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:34:11.563925 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:34:11.563905 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:39:11.578826 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:39:11.578800 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:39:11.581331 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:39:11.580762 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:44:11.598578 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:44:11.598459 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:44:11.602722 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:44:11.600521 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:49:11.614713 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:49:11.614590 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:49:11.618630 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:49:11.617876 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:54:11.630850 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:54:11.630740 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:54:11.634888 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:54:11.634453 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:59:11.647864 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:59:11.647755 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 08:59:11.651865 ip-10-0-135-129 kubenswrapper[2570]: I0423 08:59:11.651848 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 09:02:12.047856 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:12.047779 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qddjf/must-gather-wdgxb"] Apr 23 09:02:12.050755 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:12.050739 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qddjf/must-gather-wdgxb" Apr 23 09:02:12.053088 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:12.053066 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qddjf\"/\"openshift-service-ca.crt\"" Apr 23 09:02:12.053228 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:12.053074 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qddjf\"/\"default-dockercfg-gr478\"" Apr 23 09:02:12.053458 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:12.053444 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qddjf\"/\"kube-root-ca.crt\"" Apr 23 09:02:12.066427 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:12.066403 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qddjf/must-gather-wdgxb"] Apr 23 09:02:12.178990 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:12.178954 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq4jw\" (UniqueName: \"kubernetes.io/projected/361de923-d039-452b-a63d-63cb30472175-kube-api-access-kq4jw\") pod \"must-gather-wdgxb\" (UID: \"361de923-d039-452b-a63d-63cb30472175\") " pod="openshift-must-gather-qddjf/must-gather-wdgxb" Apr 23 09:02:12.178990 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:12.178993 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/361de923-d039-452b-a63d-63cb30472175-must-gather-output\") pod \"must-gather-wdgxb\" (UID: \"361de923-d039-452b-a63d-63cb30472175\") " pod="openshift-must-gather-qddjf/must-gather-wdgxb" Apr 23 09:02:12.280264 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:12.280233 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kq4jw\" (UniqueName: \"kubernetes.io/projected/361de923-d039-452b-a63d-63cb30472175-kube-api-access-kq4jw\") pod \"must-gather-wdgxb\" (UID: \"361de923-d039-452b-a63d-63cb30472175\") " pod="openshift-must-gather-qddjf/must-gather-wdgxb" Apr 23 09:02:12.280264 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:12.280274 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/361de923-d039-452b-a63d-63cb30472175-must-gather-output\") pod \"must-gather-wdgxb\" (UID: \"361de923-d039-452b-a63d-63cb30472175\") " pod="openshift-must-gather-qddjf/must-gather-wdgxb" Apr 23 09:02:12.280563 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:12.280545 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/361de923-d039-452b-a63d-63cb30472175-must-gather-output\") pod \"must-gather-wdgxb\" (UID: \"361de923-d039-452b-a63d-63cb30472175\") " pod="openshift-must-gather-qddjf/must-gather-wdgxb" Apr 23 
09:02:12.288982 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:12.288962 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq4jw\" (UniqueName: \"kubernetes.io/projected/361de923-d039-452b-a63d-63cb30472175-kube-api-access-kq4jw\") pod \"must-gather-wdgxb\" (UID: \"361de923-d039-452b-a63d-63cb30472175\") " pod="openshift-must-gather-qddjf/must-gather-wdgxb" Apr 23 09:02:12.360130 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:12.360049 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qddjf/must-gather-wdgxb" Apr 23 09:02:12.476230 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:12.476172 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qddjf/must-gather-wdgxb"] Apr 23 09:02:12.479848 ip-10-0-135-129 kubenswrapper[2570]: W0423 09:02:12.479821 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod361de923_d039_452b_a63d_63cb30472175.slice/crio-e420a41cd0cf9edcb2a03a5a64eb90ddcddebde65f04d5d39eb4982be7fb5ce2 WatchSource:0}: Error finding container e420a41cd0cf9edcb2a03a5a64eb90ddcddebde65f04d5d39eb4982be7fb5ce2: Status 404 returned error can't find the container with id e420a41cd0cf9edcb2a03a5a64eb90ddcddebde65f04d5d39eb4982be7fb5ce2 Apr 23 09:02:12.481430 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:12.481411 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 09:02:13.202747 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:13.202709 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qddjf/must-gather-wdgxb" event={"ID":"361de923-d039-452b-a63d-63cb30472175","Type":"ContainerStarted","Data":"e420a41cd0cf9edcb2a03a5a64eb90ddcddebde65f04d5d39eb4982be7fb5ce2"} Apr 23 09:02:18.219617 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:18.219571 2570 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-qddjf/must-gather-wdgxb" event={"ID":"361de923-d039-452b-a63d-63cb30472175","Type":"ContainerStarted","Data":"f7ffc9bc9688d476dbaab3003a8f8aa38df735d1441e6a644d8d51be37fac6a0"} Apr 23 09:02:18.219617 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:18.219620 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qddjf/must-gather-wdgxb" event={"ID":"361de923-d039-452b-a63d-63cb30472175","Type":"ContainerStarted","Data":"cbed1a34e2d801d8756ce9be0b6f8c7212c7c632aa941baff09b66ce05e0aa23"} Apr 23 09:02:18.236487 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:02:18.236438 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qddjf/must-gather-wdgxb" podStartSLOduration=1.277305935 podStartE2EDuration="6.236421927s" podCreationTimestamp="2026-04-23 09:02:12 +0000 UTC" firstStartedPulling="2026-04-23 09:02:12.481541318 +0000 UTC m=+2881.429728526" lastFinishedPulling="2026-04-23 09:02:17.44065731 +0000 UTC m=+2886.388844518" observedRunningTime="2026-04-23 09:02:18.234885724 +0000 UTC m=+2887.183072955" watchObservedRunningTime="2026-04-23 09:02:18.236421927 +0000 UTC m=+2887.184609156" Apr 23 09:03:05.352402 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:05.352370 2570 generic.go:358] "Generic (PLEG): container finished" podID="361de923-d039-452b-a63d-63cb30472175" containerID="cbed1a34e2d801d8756ce9be0b6f8c7212c7c632aa941baff09b66ce05e0aa23" exitCode=0 Apr 23 09:03:05.352815 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:05.352434 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qddjf/must-gather-wdgxb" event={"ID":"361de923-d039-452b-a63d-63cb30472175","Type":"ContainerDied","Data":"cbed1a34e2d801d8756ce9be0b6f8c7212c7c632aa941baff09b66ce05e0aa23"} Apr 23 09:03:05.352815 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:05.352724 2570 scope.go:117] "RemoveContainer" 
containerID="cbed1a34e2d801d8756ce9be0b6f8c7212c7c632aa941baff09b66ce05e0aa23" Apr 23 09:03:05.391772 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:05.391741 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qddjf_must-gather-wdgxb_361de923-d039-452b-a63d-63cb30472175/gather/0.log" Apr 23 09:03:10.485188 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:10.485153 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-kssx7_48ea9757-031f-4dbe-bd48-c51513e48d24/global-pull-secret-syncer/0.log" Apr 23 09:03:10.709929 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:10.709898 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-z2ckf_75c70767-827d-46b1-acb4-c76aff02f4bd/konnectivity-agent/0.log" Apr 23 09:03:10.751696 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:10.751609 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qddjf/must-gather-wdgxb"] Apr 23 09:03:10.751886 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:10.751848 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-qddjf/must-gather-wdgxb" podUID="361de923-d039-452b-a63d-63cb30472175" containerName="copy" containerID="cri-o://f7ffc9bc9688d476dbaab3003a8f8aa38df735d1441e6a644d8d51be37fac6a0" gracePeriod=2 Apr 23 09:03:10.757543 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:10.757520 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qddjf/must-gather-wdgxb"] Apr 23 09:03:10.838983 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:10.838952 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-129.ec2.internal_a24ff0e0a63ca37f66f4f5e5712330cc/haproxy/0.log" Apr 23 09:03:10.977166 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:10.977141 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-qddjf_must-gather-wdgxb_361de923-d039-452b-a63d-63cb30472175/copy/0.log" Apr 23 09:03:10.977496 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:10.977481 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qddjf/must-gather-wdgxb" Apr 23 09:03:10.979482 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:10.979458 2570 status_manager.go:895] "Failed to get status for pod" podUID="361de923-d039-452b-a63d-63cb30472175" pod="openshift-must-gather-qddjf/must-gather-wdgxb" err="pods \"must-gather-wdgxb\" is forbidden: User \"system:node:ip-10-0-135-129.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-qddjf\": no relationship found between node 'ip-10-0-135-129.ec2.internal' and this object" Apr 23 09:03:11.050092 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.050016 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/361de923-d039-452b-a63d-63cb30472175-must-gather-output\") pod \"361de923-d039-452b-a63d-63cb30472175\" (UID: \"361de923-d039-452b-a63d-63cb30472175\") " Apr 23 09:03:11.050092 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.050057 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq4jw\" (UniqueName: \"kubernetes.io/projected/361de923-d039-452b-a63d-63cb30472175-kube-api-access-kq4jw\") pod \"361de923-d039-452b-a63d-63cb30472175\" (UID: \"361de923-d039-452b-a63d-63cb30472175\") " Apr 23 09:03:11.052286 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.052258 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/361de923-d039-452b-a63d-63cb30472175-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "361de923-d039-452b-a63d-63cb30472175" (UID: "361de923-d039-452b-a63d-63cb30472175"). 
InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 09:03:11.052384 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.052266 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361de923-d039-452b-a63d-63cb30472175-kube-api-access-kq4jw" (OuterVolumeSpecName: "kube-api-access-kq4jw") pod "361de923-d039-452b-a63d-63cb30472175" (UID: "361de923-d039-452b-a63d-63cb30472175"). InnerVolumeSpecName "kube-api-access-kq4jw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:03:11.151054 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.151026 2570 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/361de923-d039-452b-a63d-63cb30472175-must-gather-output\") on node \"ip-10-0-135-129.ec2.internal\" DevicePath \"\"" Apr 23 09:03:11.151054 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.151053 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kq4jw\" (UniqueName: \"kubernetes.io/projected/361de923-d039-452b-a63d-63cb30472175-kube-api-access-kq4jw\") on node \"ip-10-0-135-129.ec2.internal\" DevicePath \"\"" Apr 23 09:03:11.367897 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.367819 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qddjf_must-gather-wdgxb_361de923-d039-452b-a63d-63cb30472175/copy/0.log" Apr 23 09:03:11.368126 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.368101 2570 generic.go:358] "Generic (PLEG): container finished" podID="361de923-d039-452b-a63d-63cb30472175" containerID="f7ffc9bc9688d476dbaab3003a8f8aa38df735d1441e6a644d8d51be37fac6a0" exitCode=143 Apr 23 09:03:11.368170 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.368153 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qddjf/must-gather-wdgxb" Apr 23 09:03:11.368233 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.368215 2570 scope.go:117] "RemoveContainer" containerID="f7ffc9bc9688d476dbaab3003a8f8aa38df735d1441e6a644d8d51be37fac6a0" Apr 23 09:03:11.370362 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.370324 2570 status_manager.go:895] "Failed to get status for pod" podUID="361de923-d039-452b-a63d-63cb30472175" pod="openshift-must-gather-qddjf/must-gather-wdgxb" err="pods \"must-gather-wdgxb\" is forbidden: User \"system:node:ip-10-0-135-129.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-qddjf\": no relationship found between node 'ip-10-0-135-129.ec2.internal' and this object" Apr 23 09:03:11.375819 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.375803 2570 scope.go:117] "RemoveContainer" containerID="cbed1a34e2d801d8756ce9be0b6f8c7212c7c632aa941baff09b66ce05e0aa23" Apr 23 09:03:11.378571 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.378547 2570 status_manager.go:895] "Failed to get status for pod" podUID="361de923-d039-452b-a63d-63cb30472175" pod="openshift-must-gather-qddjf/must-gather-wdgxb" err="pods \"must-gather-wdgxb\" is forbidden: User \"system:node:ip-10-0-135-129.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-qddjf\": no relationship found between node 'ip-10-0-135-129.ec2.internal' and this object" Apr 23 09:03:11.387358 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.387339 2570 scope.go:117] "RemoveContainer" containerID="f7ffc9bc9688d476dbaab3003a8f8aa38df735d1441e6a644d8d51be37fac6a0" Apr 23 09:03:11.387590 ip-10-0-135-129 kubenswrapper[2570]: E0423 09:03:11.387572 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7ffc9bc9688d476dbaab3003a8f8aa38df735d1441e6a644d8d51be37fac6a0\": container with ID 
starting with f7ffc9bc9688d476dbaab3003a8f8aa38df735d1441e6a644d8d51be37fac6a0 not found: ID does not exist" containerID="f7ffc9bc9688d476dbaab3003a8f8aa38df735d1441e6a644d8d51be37fac6a0" Apr 23 09:03:11.387624 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.387598 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7ffc9bc9688d476dbaab3003a8f8aa38df735d1441e6a644d8d51be37fac6a0"} err="failed to get container status \"f7ffc9bc9688d476dbaab3003a8f8aa38df735d1441e6a644d8d51be37fac6a0\": rpc error: code = NotFound desc = could not find container \"f7ffc9bc9688d476dbaab3003a8f8aa38df735d1441e6a644d8d51be37fac6a0\": container with ID starting with f7ffc9bc9688d476dbaab3003a8f8aa38df735d1441e6a644d8d51be37fac6a0 not found: ID does not exist" Apr 23 09:03:11.387624 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.387619 2570 scope.go:117] "RemoveContainer" containerID="cbed1a34e2d801d8756ce9be0b6f8c7212c7c632aa941baff09b66ce05e0aa23" Apr 23 09:03:11.387853 ip-10-0-135-129 kubenswrapper[2570]: E0423 09:03:11.387836 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbed1a34e2d801d8756ce9be0b6f8c7212c7c632aa941baff09b66ce05e0aa23\": container with ID starting with cbed1a34e2d801d8756ce9be0b6f8c7212c7c632aa941baff09b66ce05e0aa23 not found: ID does not exist" containerID="cbed1a34e2d801d8756ce9be0b6f8c7212c7c632aa941baff09b66ce05e0aa23" Apr 23 09:03:11.387887 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.387861 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbed1a34e2d801d8756ce9be0b6f8c7212c7c632aa941baff09b66ce05e0aa23"} err="failed to get container status \"cbed1a34e2d801d8756ce9be0b6f8c7212c7c632aa941baff09b66ce05e0aa23\": rpc error: code = NotFound desc = could not find container \"cbed1a34e2d801d8756ce9be0b6f8c7212c7c632aa941baff09b66ce05e0aa23\": container with ID starting with 
cbed1a34e2d801d8756ce9be0b6f8c7212c7c632aa941baff09b66ce05e0aa23 not found: ID does not exist" Apr 23 09:03:11.589261 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.589219 2570 status_manager.go:895] "Failed to get status for pod" podUID="361de923-d039-452b-a63d-63cb30472175" pod="openshift-must-gather-qddjf/must-gather-wdgxb" err="pods \"must-gather-wdgxb\" is forbidden: User \"system:node:ip-10-0-135-129.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-qddjf\": no relationship found between node 'ip-10-0-135-129.ec2.internal' and this object" Apr 23 09:03:11.589623 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:11.589379 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="361de923-d039-452b-a63d-63cb30472175" path="/var/lib/kubelet/pods/361de923-d039-452b-a63d-63cb30472175/volumes" Apr 23 09:03:14.741617 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:14.741588 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5c259_91edfee8-0edd-47a6-a65f-1cb810bbe9ed/node-exporter/0.log" Apr 23 09:03:14.770922 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:14.770898 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5c259_91edfee8-0edd-47a6-a65f-1cb810bbe9ed/kube-rbac-proxy/0.log" Apr 23 09:03:14.799506 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:14.799482 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5c259_91edfee8-0edd-47a6-a65f-1cb810bbe9ed/init-textfile/0.log" Apr 23 09:03:16.771176 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:16.771142 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-9zlnz_ee86940d-8ae8-4a41-80ec-b0743181280c/networking-console-plugin/0.log" Apr 23 09:03:17.803273 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:17.803240 2570 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl"] Apr 23 09:03:17.803657 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:17.803480 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="361de923-d039-452b-a63d-63cb30472175" containerName="gather" Apr 23 09:03:17.803657 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:17.803492 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="361de923-d039-452b-a63d-63cb30472175" containerName="gather" Apr 23 09:03:17.803657 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:17.803503 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="361de923-d039-452b-a63d-63cb30472175" containerName="copy" Apr 23 09:03:17.803657 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:17.803509 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="361de923-d039-452b-a63d-63cb30472175" containerName="copy" Apr 23 09:03:17.803657 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:17.803552 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="361de923-d039-452b-a63d-63cb30472175" containerName="gather" Apr 23 09:03:17.803657 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:17.803560 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="361de923-d039-452b-a63d-63cb30472175" containerName="copy" Apr 23 09:03:17.808687 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:17.808669 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:17.811053 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:17.811030 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmjn8\"/\"openshift-service-ca.crt\"" Apr 23 09:03:17.811350 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:17.811333 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmjn8\"/\"kube-root-ca.crt\"" Apr 23 09:03:17.811403 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:17.811333 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tmjn8\"/\"default-dockercfg-pjq8w\"" Apr 23 09:03:17.824864 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:17.820760 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl"] Apr 23 09:03:17.902102 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:17.902062 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6-proc\") pod \"perf-node-gather-daemonset-z24fl\" (UID: \"c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:17.902102 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:17.902106 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6-podres\") pod \"perf-node-gather-daemonset-z24fl\" (UID: \"c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:17.902343 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:17.902127 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6-sys\") pod \"perf-node-gather-daemonset-z24fl\" (UID: \"c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:17.902343 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:17.902146 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6-lib-modules\") pod \"perf-node-gather-daemonset-z24fl\" (UID: \"c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:17.902343 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:17.902184 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcq6h\" (UniqueName: \"kubernetes.io/projected/c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6-kube-api-access-qcq6h\") pod \"perf-node-gather-daemonset-z24fl\" (UID: \"c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:18.002723 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.002690 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6-sys\") pod \"perf-node-gather-daemonset-z24fl\" (UID: \"c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:18.002723 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.002722 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6-lib-modules\") pod \"perf-node-gather-daemonset-z24fl\" (UID: \"c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6\") " 
pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:18.002935 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.002742 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcq6h\" (UniqueName: \"kubernetes.io/projected/c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6-kube-api-access-qcq6h\") pod \"perf-node-gather-daemonset-z24fl\" (UID: \"c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:18.002935 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.002778 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6-proc\") pod \"perf-node-gather-daemonset-z24fl\" (UID: \"c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:18.002935 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.002812 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6-podres\") pod \"perf-node-gather-daemonset-z24fl\" (UID: \"c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:18.002935 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.002844 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6-sys\") pod \"perf-node-gather-daemonset-z24fl\" (UID: \"c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:18.002935 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.002890 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6-lib-modules\") pod \"perf-node-gather-daemonset-z24fl\" (UID: \"c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:18.002935 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.002916 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6-podres\") pod \"perf-node-gather-daemonset-z24fl\" (UID: \"c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:18.002935 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.002932 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6-proc\") pod \"perf-node-gather-daemonset-z24fl\" (UID: \"c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:18.011970 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.011941 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcq6h\" (UniqueName: \"kubernetes.io/projected/c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6-kube-api-access-qcq6h\") pod \"perf-node-gather-daemonset-z24fl\" (UID: \"c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6\") " pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:18.117706 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.117627 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:18.235015 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.234972 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl"] Apr 23 09:03:18.237974 ip-10-0-135-129 kubenswrapper[2570]: W0423 09:03:18.237945 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc3eaea9a_bb0a_46b9_8dba_ded57db6ebe6.slice/crio-140c4a94957e80d3175b7f1d658cf9942ec3719174443f345838617c200adcac WatchSource:0}: Error finding container 140c4a94957e80d3175b7f1d658cf9942ec3719174443f345838617c200adcac: Status 404 returned error can't find the container with id 140c4a94957e80d3175b7f1d658cf9942ec3719174443f345838617c200adcac Apr 23 09:03:18.389315 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.389225 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" event={"ID":"c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6","Type":"ContainerStarted","Data":"f558f2eb05c007fb31cf6af723f53fee71e97c7bc1d4ac3828d4df151a1607a2"} Apr 23 09:03:18.389315 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.389260 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" event={"ID":"c3eaea9a-bb0a-46b9-8dba-ded57db6ebe6","Type":"ContainerStarted","Data":"140c4a94957e80d3175b7f1d658cf9942ec3719174443f345838617c200adcac"} Apr 23 09:03:18.389488 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.389348 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:18.405536 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.405487 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" 
podStartSLOduration=1.405473745 podStartE2EDuration="1.405473745s" podCreationTimestamp="2026-04-23 09:03:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:03:18.404295458 +0000 UTC m=+2947.352482719" watchObservedRunningTime="2026-04-23 09:03:18.405473745 +0000 UTC m=+2947.353660973" Apr 23 09:03:18.707698 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.707615 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2lvrc_52a0af2f-2394-4095-ac7f-6cc85c59c3a7/dns/0.log" Apr 23 09:03:18.733240 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.733183 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2lvrc_52a0af2f-2394-4095-ac7f-6cc85c59c3a7/kube-rbac-proxy/0.log" Apr 23 09:03:18.922825 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:18.922797 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mslvd_9f4d9f29-8efc-4799-b966-20f9d049fd33/dns-node-resolver/0.log" Apr 23 09:03:19.487967 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:19.487936 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fgmqg_749e4605-b37e-4004-a2dc-68092884ddae/node-ca/0.log" Apr 23 09:03:20.613047 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:20.613013 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-pmw99_bdabc8c7-71d8-4169-b0d7-8304d3f874d8/serve-healthcheck-canary/0.log" Apr 23 09:03:21.075839 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:21.075812 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b2s4p_054d81ed-b5d4-4a4d-b77f-47078e718470/kube-rbac-proxy/0.log" Apr 23 09:03:21.101873 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:21.101845 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-b2s4p_054d81ed-b5d4-4a4d-b77f-47078e718470/exporter/0.log" Apr 23 09:03:21.127108 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:21.127076 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b2s4p_054d81ed-b5d4-4a4d-b77f-47078e718470/extractor/0.log" Apr 23 09:03:22.924766 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:22.924732 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-g6ptd_2a8fe58b-a624-41ac-9cf4-b9dba545dd91/jobset-operator/0.log" Apr 23 09:03:24.400755 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:24.400729 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tmjn8/perf-node-gather-daemonset-z24fl" Apr 23 09:03:28.164694 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:28.164665 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rctnw_575d4e03-f407-47ca-9efc-8c7bee335d30/kube-multus-additional-cni-plugins/0.log" Apr 23 09:03:28.190761 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:28.190735 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rctnw_575d4e03-f407-47ca-9efc-8c7bee335d30/egress-router-binary-copy/0.log" Apr 23 09:03:28.219289 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:28.219260 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rctnw_575d4e03-f407-47ca-9efc-8c7bee335d30/cni-plugins/0.log" Apr 23 09:03:28.247242 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:28.247222 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rctnw_575d4e03-f407-47ca-9efc-8c7bee335d30/bond-cni-plugin/0.log" Apr 23 09:03:28.273906 ip-10-0-135-129 kubenswrapper[2570]: I0423 
09:03:28.273887 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rctnw_575d4e03-f407-47ca-9efc-8c7bee335d30/routeoverride-cni/0.log" Apr 23 09:03:28.299500 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:28.299441 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rctnw_575d4e03-f407-47ca-9efc-8c7bee335d30/whereabouts-cni-bincopy/0.log" Apr 23 09:03:28.326679 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:28.326649 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rctnw_575d4e03-f407-47ca-9efc-8c7bee335d30/whereabouts-cni/0.log" Apr 23 09:03:28.395275 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:28.395253 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hrjxz_f033049a-1b87-451a-a1fc-53b7ebf036df/kube-multus/0.log" Apr 23 09:03:28.453551 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:28.453520 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4gzjb_5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f/network-metrics-daemon/0.log" Apr 23 09:03:28.478882 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:28.478857 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4gzjb_5e2ddc62-96a8-4827-87cc-a0b37c9a5e9f/kube-rbac-proxy/0.log" Apr 23 09:03:29.736367 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:29.736339 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-controller/0.log" Apr 23 09:03:29.764746 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:29.764723 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/0.log" Apr 23 09:03:29.776672 
ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:29.776649 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovn-acl-logging/1.log" Apr 23 09:03:29.801363 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:29.801316 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/kube-rbac-proxy-node/0.log" Apr 23 09:03:29.824912 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:29.824885 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 09:03:29.848409 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:29.848376 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/northd/0.log" Apr 23 09:03:29.872723 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:29.872700 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/nbdb/0.log" Apr 23 09:03:29.898569 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:29.898542 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/sbdb/0.log" Apr 23 09:03:29.990445 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:29.990368 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvjc_ffbe5334-bba8-45bc-bd64-2141ea3f49a8/ovnkube-controller/0.log" Apr 23 09:03:31.658171 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:31.658143 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-wrt6p_93ac6a8e-10aa-4687-be43-6d712bee9ebd/network-check-target-container/0.log" Apr 
23 09:03:32.582995 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:32.582965 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-7wvqt_887c87c3-07ae-4b74-aa64-fe19546746e0/iptables-alerter/0.log" Apr 23 09:03:33.295853 ip-10-0-135-129 kubenswrapper[2570]: I0423 09:03:33.295823 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-cxzv6_fca14dff-5b1c-41d2-adc2-0d42ef722a54/tuned/0.log"