Apr 23 08:45:27.009811 ip-10-0-131-47 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 08:45:27.009825 ip-10-0-131-47 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 08:45:27.009834 ip-10-0-131-47 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 08:45:27.010139 ip-10-0-131-47 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 08:45:37.189920 ip-10-0-131-47 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 08:45:37.189940 ip-10-0-131-47 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 197a9f836d984282a49e544df100c7e9 --
Apr 23 08:48:04.360703 ip-10-0-131-47 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 08:48:04.773859 ip-10-0-131-47 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:48:04.773859 ip-10-0-131-47 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 08:48:04.773859 ip-10-0-131-47 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:48:04.773859 ip-10-0-131-47 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 08:48:04.773859 ip-10-0-131-47 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:48:04.775320 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.775218 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 08:48:04.778527 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778513 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:48:04.778527 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778528 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778532 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778535 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778538 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778541 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778543 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778546 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778549 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778553 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778557 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778560 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778562 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778565 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778568 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778571 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778574 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778576 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778579 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778583 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:48:04.778586 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778585 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778588 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778591 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778593 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778597 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778600 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778602 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778605 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778607 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778611 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778613 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778616 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778618 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778621 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778623 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778626 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778628 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778631 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778634 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778636 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:48:04.779042 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778639 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778641 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778644 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778646 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778648 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778651 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778653 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778655 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778658 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778660 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778663 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778665 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778668 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778671 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778674 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778676 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778679 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778681 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778683 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:48:04.779557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778686 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778688 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778691 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778693 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778696 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778698 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778701 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778705 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778709 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778712 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778715 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778717 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778720 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778722 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778725 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778727 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778730 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778732 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778735 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778738 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:48:04.780010 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778740 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778743 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778745 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778749 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778752 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778755 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.778757 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779150 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779155 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779158 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779160 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779163 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779166 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779168 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779171 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779173 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779176 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779184 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779187 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779189 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:48:04.780620 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779192 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779195 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779199 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779202 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779205 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779208 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779210 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779213 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779216 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779218 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779221 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779223 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779225 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779228 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779230 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779232 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779235 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779238 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779240 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:48:04.781090 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779243 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779246 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779248 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779251 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779253 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779256 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779258 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779261 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779263 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779266 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779268 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779271 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779274 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779276 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779279 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779282 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779284 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779287 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779289 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779291 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:48:04.781576 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779294 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779296 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779299 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779301 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779304 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779306 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779308 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779311 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779313 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779316 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779318 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779321 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779324 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779327 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779329 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779332 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779334 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779353 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779359 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779363 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:48:04.782062 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779366 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779369 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779371 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779374 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779377 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779380 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779382 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779386 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779390 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779392 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779395 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779398 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779401 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.779403 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779484 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779507 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779515 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779519 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779524 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779527 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779531 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 08:48:04.782557 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779536 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779540 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779543 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779547 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779550 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779553 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779556 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779559 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779563 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779566 2579 flags.go:64] FLAG: --cloud-config=""
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779568 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779571 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779576 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779579 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779582 2579 flags.go:64] FLAG: --config-dir=""
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779585 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779594 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779598 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779601 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779604 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779608 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779610 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779613 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779616 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779619 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 08:48:04.783052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779622 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779626 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779629 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779632 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779635 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779638 2579 flags.go:64] FLAG: --enable-server="true"
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779641 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779644 2579 flags.go:64] FLAG: --event-burst="100"
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779648 2579 flags.go:64] FLAG: --event-qps="50"
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779651 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779654 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779657 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779661 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779664 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779667 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779670 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779673 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779676 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779679 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779682 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779685 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779687 2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779690 2579 flags.go:64] FLAG: --feature-gates=""
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779694 2579 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779698 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 08:48:04.783706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779701 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779704 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779707 2579 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779710 2579 flags.go:64] FLAG: --help="false"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779713 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-131-47.ec2.internal"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779718 2579 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779722 2579 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779724 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779728 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779731 2579 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779734 2579 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779737 2579 flags.go:64] FLAG: --image-service-endpoint=""
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779740 2579 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779743 2579 flags.go:64] FLAG: --kube-api-burst="100"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779745 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779748 2579 flags.go:64] FLAG: --kube-api-qps="50"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779751 2579 flags.go:64] FLAG: --kube-reserved=""
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779754 2579 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779757 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779760 2579 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779764 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779767 2579 flags.go:64] FLAG: --lock-file=""
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779769 2579 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779772 2579 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 23 08:48:04.784297 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779775 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779781 2579 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779783 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779786 2579 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779789 2579 flags.go:64] FLAG: --logging-format="text"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779792 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779796 2579 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779799 2579 flags.go:64] FLAG: --manifest-url=""
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779803 2579 flags.go:64] FLAG: --manifest-url-header=""
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779807 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779810 2579 flags.go:64] FLAG: --max-open-files="1000000"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779814 2579 flags.go:64] FLAG: --max-pods="110"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779817 2579 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779820 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779823 2579 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779825 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779828 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779831 2579 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779834 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779841 2579 flags.go:64] FLAG: --node-status-max-images="50"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779844 2579 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779847 2579 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779850 2579 flags.go:64] FLAG: --pod-cidr=""
Apr 23 08:48:04.784910 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779853 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779858 2579 flags.go:64] FLAG: --pod-manifest-path=""
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779862 2579 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779865 2579 flags.go:64] FLAG: --pods-per-core="0"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779868 2579 flags.go:64] FLAG: --port="10250"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779871 2579 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779874 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-023f7e5e0c40f1cee"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779878 2579 flags.go:64] FLAG: --qos-reserved=""
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779880 2579 flags.go:64] FLAG: --read-only-port="10255"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779883 2579 flags.go:64] FLAG: --register-node="true"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779886 2579 flags.go:64] FLAG: --register-schedulable="true"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779889 2579 flags.go:64] FLAG: --register-with-taints=""
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779893 2579 flags.go:64] FLAG: --registry-burst="10"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779895 2579 flags.go:64] FLAG: --registry-qps="5"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779898 2579 flags.go:64] FLAG: --reserved-cpus=""
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779901 2579 flags.go:64] FLAG: --reserved-memory=""
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779904 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779907 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779911 2579 flags.go:64] FLAG: --rotate-certificates="false"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779914 2579 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779917 2579 flags.go:64] FLAG: --runonce="false"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779920 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779923 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779926 2579 flags.go:64] FLAG: --seccomp-default="false"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779929 2579 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779931 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 23 08:48:04.785484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779935 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779938 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779941 2579 flags.go:64] FLAG: --storage-driver-password="root"
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779943 2579 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779946 2579 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779949 2579 flags.go:64] FLAG: --storage-driver-user="root"
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779952 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779955 2579 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779958 2579 flags.go:64] FLAG: --system-cgroups=""
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779961 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779967 2579 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779970 2579 flags.go:64] FLAG: --tls-cert-file=""
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779972 2579 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779977 2579 flags.go:64] FLAG: --tls-min-version=""
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779979 2579 flags.go:64] FLAG: --tls-private-key-file=""
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779982 2579 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779985 2579 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779988 2579 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779991 2579 flags.go:64] FLAG: --v="2"
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779995 2579 flags.go:64] FLAG: --version="false"
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.779999 2579 flags.go:64] FLAG: --vmodule=""
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.780004 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.780007 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780105 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:48:04.786098 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780110 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780119 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780123 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780126 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780128 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780131 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780134 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780137 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780142 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780145 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780148 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780151 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780154 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780157 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780159 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780162 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780167 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780169 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780172 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780174 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:48:04.786673 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780177 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780179 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780182 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780185 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780187 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780190 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780192 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780195 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780197 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780200 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780202 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780205 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780207 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780210 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780212 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780215 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780218 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780220 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780223 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780225 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:48:04.787259 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780229 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780231 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780234 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780237 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780239 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780242 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780244 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780247 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780250 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780253 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780255 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780258 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780260 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780263 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780265 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780268 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780270 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780273 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780276 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780278 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:48:04.787776 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780281 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780284 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780286 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780288 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780292 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780296 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780299 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780302 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780304 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780306 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780309 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780311 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780315 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780317 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780320 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780322 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780325 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780327 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780329 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780332 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:48:04.788265 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780335 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:48:04.788757 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780352 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:48:04.788757 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780355 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:48:04.788757 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780358 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23
08:48:04.788757 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.780360 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 08:48:04.788757 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.780935 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 08:48:04.788757 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.788409 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 08:48:04.788757 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.788526 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 08:48:04.788757 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788581 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:48:04.788757 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788586 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 08:48:04.788757 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788589 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:48:04.788757 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788593 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 08:48:04.788757 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788596 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 08:48:04.788757 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788599 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:48:04.788757 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788603 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 08:48:04.788757 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788607 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788610 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788614 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
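[Annotation] The repeated "unrecognized feature gate" warnings above come from handing a component a cluster-wide gate list that its own registry never registered. A minimal sketch of that mechanism with k8s.io/component-base/featuregate, under the assumption that the gate names and defaults here are illustrative; note upstream SetFromMap returns an error for unknown gates, while the kubelet build logging above evidently downgrades this to a warning:

```go
package main

import (
	"fmt"

	"k8s.io/component-base/featuregate"
)

func main() {
	fg := featuregate.NewFeatureGate()

	// Register only the gates this (hypothetical) component knows about.
	_ = fg.Add(map[featuregate.Feature]featuregate.FeatureSpec{
		"NodeSwap": {Default: false, PreRelease: featuregate.Beta},
		"KMSv1":    {Default: false, PreRelease: featuregate.Deprecated},
	})

	// Pass a cluster-level gate list containing a name the registry has
	// never seen ("GatewayAPI") -- the same shape as the warnings above.
	err := fg.SetFromMap(map[string]bool{
		"NodeSwap":   false,
		"KMSv1":      true, // deprecated gate, analogous to the feature_gate.go:349 warning
		"GatewayAPI": true, // unknown to this registry
	})
	fmt.Println("SetFromMap:", err) // upstream: "unrecognized feature gate: GatewayAPI"
	fmt.Println("NodeSwap enabled:", fg.Enabled("NodeSwap"))
}
```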
Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788618 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788622 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788625 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788628 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788631 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788633 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788636 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788638 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788640 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788643 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788646 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788648 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788651 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788653 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788656 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788658 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788661 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:48:04.789137 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788663 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788665 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788668 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788670 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788674 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788677 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788679 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788682 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788684 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788687 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788690 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788692 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788694 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788697 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788699 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788702 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788704 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788707 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788710 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788712 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:48:04.789633 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788714 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788717 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788719 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788722 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788724 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788727 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788729 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788732 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788734 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788736 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788739 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788741 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788744 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788746 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788749 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788751 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788754 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788757 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788759 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788762 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:48:04.790104 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788764 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:48:04.790713 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788767 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:48:04.790713 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788770 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:48:04.790713 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788772 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:48:04.790713 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788775 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:48:04.790713 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788777 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:48:04.790713 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788780 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:48:04.790713 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788782 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:48:04.790713 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788785 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:48:04.790713 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788787 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:48:04.790713 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788790 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:48:04.790713 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788793 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:48:04.790713 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788796 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:48:04.790713 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788798 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:48:04.790713 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788800 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:48:04.790713 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788803 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:48:04.790713 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788806 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:48:04.790713 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788808 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:48:04.790713 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788811 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:48:04.791555 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.788816 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:48:04.791555 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788904 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:48:04.791555 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788909 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:48:04.791555 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788912 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:48:04.791555 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788915 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:48:04.791555 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788917 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:48:04.791555 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788920 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:48:04.791555 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788923 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:48:04.791555 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788926 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:48:04.791555 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788929 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:48:04.791555 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788932 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:48:04.791555 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788935 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:48:04.791555 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788938 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:48:04.791555 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788940 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:48:04.791555 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788943 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:48:04.791555 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788946 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788948 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788951 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788953 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788956 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788959 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788961 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788964 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788966 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788969 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788972 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788975 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788977 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788979 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788982 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788985 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788988 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788990 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788992 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:48:04.791959 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788995 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.788998 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789001 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789004 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789007 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789009 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789012 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789014 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789017 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789019 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789021 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789024 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789026 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789028 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789031 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789033 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789035 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789038 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789040 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:48:04.792485 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789042 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789045 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789047 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789049 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789052 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789055 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789057 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789059 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789062 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789064 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789067 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789069 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789072 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789074 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789076 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789079 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789081 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789083 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789086 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789088 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:48:04.792983 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789090 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:48:04.793473 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789093 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:48:04.793473 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789096 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:48:04.793473 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789098 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:48:04.793473 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789100 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:48:04.793473 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789104 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:48:04.793473 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789107 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:48:04.793473 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789110 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:48:04.793473 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789113 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:48:04.793473 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789115 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:48:04.793473 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789118 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:48:04.793473 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789120 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:48:04.793473 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789123 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:48:04.793473 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:04.789125 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:48:04.793473 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.789130 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:48:04.793473 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.789740 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 08:48:04.793870 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.793635 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 08:48:04.794540 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.794500 2579 server.go:1019] "Starting client certificate rotation"
Apr 23 08:48:04.794631 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.794613 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:48:04.794687 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.794660 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:48:04.817943 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.817913 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:48:04.820424 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.820399 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:48:04.834240 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.834223 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 23 08:48:04.840722 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.840706 2579 log.go:25] "Validated CRI v1 image API"
Apr 23 08:48:04.842203 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.842175 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
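[Annotation] The "Validated CRI v1 runtime API" / "Validated CRI v1 image API" lines record the kubelet probing the container runtime over its gRPC socket before use. A rough sketch of that handshake, assuming CRI-O's default socket path (the kubelet does this through its internal remote runtime client rather than this exact code):

```go
package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	pb "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Dial the runtime's unix socket (CRI-O default path assumed here).
	conn, err := grpc.NewClient("unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
	defer cancel()

	// Version is the RPC the kubelet uses to confirm the runtime speaks
	// CRI v1 before it registers the CRI-O container factory.
	resp, err := pb.NewRuntimeServiceClient(conn).Version(ctx, &pb.VersionRequest{})
	if err != nil {
		panic(err)
	}
	fmt.Printf("%s %s (CRI %s)\n", resp.RuntimeName, resp.RuntimeVersion, resp.RuntimeApiVersion)
}
```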
Apr 23 08:48:04.847039 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.847018 2579 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b4a2b878-1dda-4fe4-90ce-d114859e4ae0:/dev/nvme0n1p4 cb85f03e-358d-4e41-9fbd-0602f53318b1:/dev/nvme0n1p3]
Apr 23 08:48:04.847125 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.847038 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 08:48:04.854647 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.854627 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 08:48:04.854730 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.854572 2579 manager.go:217] Machine: {Timestamp:2026-04-23 08:48:04.853491881 +0000 UTC m=+0.382971909 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097788 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec214483455e1752f871d39467b6dd85 SystemUUID:ec214483-455e-1752-f871-d39467b6dd85 BootID:197a9f83-6d98-4282-a49e-544df100c7e9 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:4a:a1:18:bb:d5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:4a:a1:18:bb:d5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:72:98:55:01:fc:2a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 08:48:04.854806 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.854733 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 08:48:04.854925 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.854906 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 08:48:04.856738 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.856712 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 08:48:04.856908 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.856742 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-47.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 08:48:04.856994 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.856922 2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 08:48:04.856994 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.856936 2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 08:48:04.856994 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.856954 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 08:48:04.857631 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.857618 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 08:48:04.859130 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.859117 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 08:48:04.859264 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.859253 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 08:48:04.861497 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.861486 2579 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 08:48:04.861560 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.861503 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 08:48:04.861560 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.861519 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 08:48:04.861560 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.861534 2579 kubelet.go:397] "Adding apiserver pod source"
Apr 23 08:48:04.861560 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.861546 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 08:48:04.862447 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.862434 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 08:48:04.862525 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.862457 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 08:48:04.865320 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.865301 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 08:48:04.866680 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.866667 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 08:48:04.868444 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.868433 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 08:48:04.868496 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.868450 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 08:48:04.868496 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.868456 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 08:48:04.868496 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.868462 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 08:48:04.868496 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.868468 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 08:48:04.868496 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.868474 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 08:48:04.868496 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.868479 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 08:48:04.868496 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.868485 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 08:48:04.868496 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.868493 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 08:48:04.868496 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.868500 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 08:48:04.868724 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.868514 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 08:48:04.868724 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.868523 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 08:48:04.869359 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.869325 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 08:48:04.869359 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.869351 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 08:48:04.873452 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.873439 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 08:48:04.873538 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.873475 2579 server.go:1295] "Started kubelet"
Apr 23 08:48:04.873593 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.873556 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 08:48:04.873644 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.873581 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 08:48:04.873704 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.873689 2579 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 08:48:04.874611 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.874591 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-47.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 08:48:04.874683 ip-10-0-131-47 systemd[1]: Started Kubernetes Kubelet.
Apr 23 08:48:04.874762 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:04.874709 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 08:48:04.874795 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:04.874759 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-47.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 08:48:04.875431 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.875319 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 08:48:04.875484 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.875446 2579 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 08:48:04.879852 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:04.878236 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-47.ec2.internal.18a8f022b62a795b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-47.ec2.internal,UID:ip-10-0-131-47.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-47.ec2.internal,},FirstTimestamp:2026-04-23 08:48:04.873451867 +0000 UTC m=+0.402931883,LastTimestamp:2026-04-23 08:48:04.873451867 +0000 UTC m=+0.402931883,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-47.ec2.internal,}"
Apr 23 08:48:04.880722 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.880700 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 08:48:04.880835 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.880766 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pvb8m"
Apr 23 08:48:04.881140 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.881120 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 08:48:04.881676 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.881663 2579 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 08:48:04.881781 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.881771 2579 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 08:48:04.881904 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.881886 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 08:48:04.882003 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.881957 2579 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 08:48:04.882003 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.881965 2579 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 08:48:04.882117 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:04.882054 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found"
Apr 23 08:48:04.882498 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.882433 2579 factory.go:55] Registering systemd factory
Apr 23 08:48:04.882498 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.882505 2579 factory.go:223] Registration of the systemd container factory successfully
Apr 23 08:48:04.882809 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.882795 2579 factory.go:153] Registering CRI-O factory
Apr 23 08:48:04.882872 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.882814 2579 factory.go:223] Registration of the crio container factory successfully
Apr 23 08:48:04.882872 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.882864 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 08:48:04.882951 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.882887 2579 factory.go:103] Registering Raw factory
Apr 23 08:48:04.882951 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.882902 2579 manager.go:1196] Started watching for new ooms in manager
Apr 23 08:48:04.883367 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.883325 2579 manager.go:319] Starting recovery of all containers
Apr 23 08:48:04.884276 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:04.884257 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 08:48:04.885159 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:04.885133 2579 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-47.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 08:48:04.885996 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:04.885594 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 08:48:04.885996 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.885882 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pvb8m"
Apr 23 08:48:04.894699 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.894679 2579 manager.go:324] Recovery completed
Apr 23 08:48:04.898912 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.898896 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:48:04.901610 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.901593 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:48:04.901678 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.901624 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:48:04.901678 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.901635 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:48:04.902183 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.902169 2579 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 08:48:04.902183 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.902179 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 08:48:04.902267 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.902195 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 08:48:04.905417 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.905405 2579 policy_none.go:49] "None policy: Start"
Apr 23 08:48:04.905460 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.905423 2579 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 08:48:04.905460 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.905433 2579 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 08:48:04.940917 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.940901 2579 manager.go:341] "Starting Device Plugin manager"
Apr 23 08:48:04.962380 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:04.940934 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 08:48:04.962380 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.940946 2579 server.go:85] "Starting device plugin registration server"
Apr 23 08:48:04.962380 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.941204 2579 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 08:48:04.962380 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.941215 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 08:48:04.962380 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.941310 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 08:48:04.962380 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.941410 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 08:48:04.962380 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:04.941419 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 08:48:04.962380 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:04.941989 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 08:48:04.962380 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:04.942021 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-47.ec2.internal\" not found"
Apr 23 08:48:05.009627 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.009588 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 08:48:05.010963 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.010940 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 08:48:05.011067 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.010968 2579 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 08:48:05.011067 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.010992 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 08:48:05.011067 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.011003 2579 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 08:48:05.011224 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:05.011103 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 08:48:05.013613 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.013597 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:48:05.041824 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.041804 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:48:05.042758 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.042744 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:48:05.042815 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.042773 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:48:05.042815 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.042784 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:48:05.042815 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.042806 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.051310 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.051270 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.051310 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:05.051292 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-47.ec2.internal\": node \"ip-10-0-131-47.ec2.internal\" not found"
Apr 23 08:48:05.070391 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:05.070373 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found"
Apr 23 08:48:05.111246 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.111207 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal"]
Apr 23 08:48:05.111442 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.111303 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:48:05.119000 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.118983 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:48:05.119072 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.119031 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:48:05.119072 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.119041 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:48:05.121314 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.121299 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:48:05.121464 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.121448 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.121513 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.121479 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:48:05.122078 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.122060 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:48:05.122133 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.122089 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:48:05.122133 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.122090 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:48:05.122133 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.122099 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:48:05.122133 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.122112 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:48:05.122133 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.122122 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:48:05.124363 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.124333 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.124418 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.124379 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 08:48:05.125253 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.125234 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 08:48:05.125327 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.125266 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 08:48:05.125327 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.125294 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientPID"
Apr 23 08:48:05.137856 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:05.137831 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-47.ec2.internal\" not found" node="ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.141534 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:05.141519 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-47.ec2.internal\" not found" node="ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.170730 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:05.170710 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found"
Apr 23 08:48:05.184346 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.184321 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/438108d93a37ab59c6a0c9e57eee327c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal\" (UID: \"438108d93a37ab59c6a0c9e57eee327c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.184402 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.184366 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/438108d93a37ab59c6a0c9e57eee327c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal\" (UID: \"438108d93a37ab59c6a0c9e57eee327c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.184402 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.184382 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b712f12e316b1a6ded9d349ca82d37a-config\") pod \"kube-apiserver-proxy-ip-10-0-131-47.ec2.internal\" (UID: \"8b712f12e316b1a6ded9d349ca82d37a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.271264 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:05.271233 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found"
Apr 23 08:48:05.284617 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.284595 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b712f12e316b1a6ded9d349ca82d37a-config\") pod \"kube-apiserver-proxy-ip-10-0-131-47.ec2.internal\" (UID: \"8b712f12e316b1a6ded9d349ca82d37a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.284684 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.284627 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/438108d93a37ab59c6a0c9e57eee327c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal\" (UID: \"438108d93a37ab59c6a0c9e57eee327c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.284684 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.284645 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/438108d93a37ab59c6a0c9e57eee327c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal\" (UID: \"438108d93a37ab59c6a0c9e57eee327c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.284684 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.284671 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/438108d93a37ab59c6a0c9e57eee327c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal\" (UID: \"438108d93a37ab59c6a0c9e57eee327c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.284771 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.284694 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b712f12e316b1a6ded9d349ca82d37a-config\") pod \"kube-apiserver-proxy-ip-10-0-131-47.ec2.internal\" (UID: \"8b712f12e316b1a6ded9d349ca82d37a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.284771 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.284717 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/438108d93a37ab59c6a0c9e57eee327c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal\" (UID: \"438108d93a37ab59c6a0c9e57eee327c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.372099 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:05.372021 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found"
Apr 23 08:48:05.439540 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.439503 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.444529 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.444504 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.473038 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:05.473011 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found"
Apr 23 08:48:05.573568 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:05.573531 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found"
Apr 23 08:48:05.674121 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:05.674091 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found"
Apr 23 08:48:05.688210 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.688189 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:48:05.782404 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.782377 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.791689 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.791667 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 08:48:05.793278 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.793263 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.794806 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.794791 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 08:48:05.794913 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.794897 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 08:48:05.794959 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.794943 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 08:48:05.794993 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:05.794957 2579 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a06a4564594a745949da182889c168e1-ee8d086a8de6f335.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.131.47:57168->52.205.186.151:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal"
Apr 23 08:48:05.862139 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.862102 2579 apiserver.go:52] "Watching apiserver"
Apr 23 08:48:05.878195 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.878161 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 08:48:05.878508 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.878486 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-c68x8","openshift-multus/multus-additional-cni-plugins-jwnqb","openshift-multus/network-metrics-daemon-96m6d","openshift-network-diagnostics/network-check-target-4mjb6","openshift-network-operator/iptables-alerter-zf6q5","kube-system/konnectivity-agent-6hwnm","openshift-image-registry/node-ca-z6xdx","openshift-multus/multus-h2vsr","openshift-ovn-kubernetes/ovnkube-node-9bsnx","kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p","openshift-cluster-node-tuning-operator/tuned-9jfcj"]
Apr 23 08:48:05.880891 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.880873 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 08:48:05.881237 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.881218 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-c68x8"
Apr 23 08:48:05.882919 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.882897 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-tnkm9\""
Apr 23 08:48:05.883027 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.883009 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 08:48:05.883089 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.883035 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 08:48:05.886304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.886265 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.888662 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.888560 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 08:43:04 +0000 UTC" deadline="2028-01-06 20:02:14.051458558 +0000 UTC"
Apr 23 08:48:05.888662 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.888593 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14963h14m8.162869475s"
Apr 23 08:48:05.888662 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.888642 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6d76e4ef-b56a-40fd-9c37-6eb55602fea4-hosts-file\") pod \"node-resolver-c68x8\" (UID: \"6d76e4ef-b56a-40fd-9c37-6eb55602fea4\") " pod="openshift-dns/node-resolver-c68x8"
Apr 23 08:48:05.888853 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.888676 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6d76e4ef-b56a-40fd-9c37-6eb55602fea4-tmp-dir\") pod \"node-resolver-c68x8\" (UID: \"6d76e4ef-b56a-40fd-9c37-6eb55602fea4\") " pod="openshift-dns/node-resolver-c68x8"
Apr 23 08:48:05.888853 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.888708 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e6d8633-b9de-4e37-96e7-465a09675f90-system-cni-dir\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.888853 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.888736 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e6d8633-b9de-4e37-96e7-465a09675f90-cni-binary-copy\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.888853 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.888754 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 08:48:05.888853 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.888760 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e6d8633-b9de-4e37-96e7-465a09675f90-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.888853 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.888786 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8e6d8633-b9de-4e37-96e7-465a09675f90-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.888853 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.888814 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gclgm\" (UniqueName: \"kubernetes.io/projected/6d76e4ef-b56a-40fd-9c37-6eb55602fea4-kube-api-access-gclgm\") pod \"node-resolver-c68x8\" (UID: \"6d76e4ef-b56a-40fd-9c37-6eb55602fea4\") " pod="openshift-dns/node-resolver-c68x8"
Apr 23 08:48:05.888853 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.888847 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e6d8633-b9de-4e37-96e7-465a09675f90-cnibin\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.889214 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.888871 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e6d8633-b9de-4e37-96e7-465a09675f90-os-release\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.889214 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.888880 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-fqh4q\""
Apr 23 08:48:05.889214 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.888902 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8e6d8633-b9de-4e37-96e7-465a09675f90-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.889214 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.888930 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcfg5\" (UniqueName: \"kubernetes.io/projected/8e6d8633-b9de-4e37-96e7-465a09675f90-kube-api-access-zcfg5\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.889214 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.888762 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 08:48:05.889214 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.889067 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 08:48:05.889214 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.889124 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 08:48:05.889214 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.889192 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 08:48:05.889645 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.889630 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96m6d"
Apr 23 08:48:05.889729 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:05.889705 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96m6d" podUID="f3e78a7f-f7cc-48d6-a29a-7d418c195aef"
Apr 23 08:48:05.889809 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.889728 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6"
Apr 23 08:48:05.889809 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:05.889794 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4mjb6" podUID="bb54c9e2-2568-441d-a59a-ffa007afefd6"
Apr 23 08:48:05.891030 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.891011 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 08:48:05.891772 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.891754 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zf6q5"
Apr 23 08:48:05.893249 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.893235 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:48:05.893479 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.893465 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 08:48:05.893688 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.893666 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 08:48:05.893863 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.893850 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-5hdm4\""
Apr 23 08:48:05.893930 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.893849 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6hwnm"
Apr 23 08:48:05.895637 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.895611 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 08:48:05.895786 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.895769 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-n2m64\""
Apr 23 08:48:05.895851 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.895804 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 08:48:05.896070 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.896055 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-z6xdx"
Apr 23 08:48:05.897473 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.897454 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 08:48:05.897830 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.897814 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 08:48:05.897971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.897955 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5lxjk\""
Apr 23 08:48:05.898040 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.897955 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 08:48:05.898217 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.898204 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h2vsr"
Apr 23 08:48:05.899760 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.899744 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 08:48:05.899846 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.899801 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-j9zjj\""
Apr 23 08:48:05.901084 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.900740 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.903267 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.903207 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p"
Apr 23 08:48:05.903593 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.903555 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 08:48:05.903695 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.903591 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-b8zhm\""
Apr 23 08:48:05.903695 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.903653 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 08:48:05.903801 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.903733 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 08:48:05.903851 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.903827 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 08:48:05.903966 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.903947 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 08:48:05.904144 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.904126 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 08:48:05.904821 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.904804 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 08:48:05.905057 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.905044 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 08:48:05.905171 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.905156 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-77bxt\""
Apr 23 08:48:05.905571 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.905554 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 08:48:05.905961 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.905942 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9jfcj"
Apr 23 08:48:05.907573 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.907556 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jc9lv\""
Apr 23 08:48:05.907642 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.907578 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 08:48:05.907642 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.907588 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:48:05.914228 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.914065 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-rsrzd"
Apr 23 08:48:05.919235 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.919220 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-rsrzd"
Apr 23 08:48:05.983408 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.983387 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 08:48:05.989144 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989120 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8e6d8633-b9de-4e37-96e7-465a09675f90-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.989254 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989151 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-run-systemd\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.989254 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989170 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-cni-bin\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.989254 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989185 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/83905907-ce4f-4b05-a10c-1a78044f595e-iptables-alerter-script\") pod \"iptables-alerter-zf6q5\" (UID: \"83905907-ce4f-4b05-a10c-1a78044f595e\") " pod="openshift-network-operator/iptables-alerter-zf6q5"
Apr 23 08:48:05.989254 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989209 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/52011e07-e36d-48b0-bed0-421685c0e544-agent-certs\") pod \"konnectivity-agent-6hwnm\" (UID: \"52011e07-e36d-48b0-bed0-421685c0e544\") " pod="kube-system/konnectivity-agent-6hwnm"
Apr 23 08:48:05.989254 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989230 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-etc-sysctl-d\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj"
Apr 23 08:48:05.989532 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989300 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-etc-systemd\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj"
Apr 23 08:48:05.989532 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989361 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-lib-modules\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj"
Apr 23 08:48:05.989532 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989390 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a0aeb874-3a11-4639-a579-36512ba94069-host\") pod \"node-ca-z6xdx\" (UID: \"a0aeb874-3a11-4639-a579-36512ba94069\") " pod="openshift-image-registry/node-ca-z6xdx"
Apr 23 08:48:05.989532 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989417 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-ovn-node-metrics-cert\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.989532 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989455 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6d76e4ef-b56a-40fd-9c37-6eb55602fea4-tmp-dir\") pod \"node-resolver-c68x8\" (UID: \"6d76e4ef-b56a-40fd-9c37-6eb55602fea4\") " pod="openshift-dns/node-resolver-c68x8"
Apr 23 08:48:05.989532 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989480 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-run-netns\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.989532 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989509 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8e6d8633-b9de-4e37-96e7-465a09675f90-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.989858 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989537 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-ovnkube-script-lib\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.989858 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989562 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-host-run-k8s-cni-cncf-io\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr"
Apr 23 08:48:05.989858 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989587 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-host-run-multus-certs\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr"
Apr 23 08:48:05.989858 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989608 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-sys\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj"
Apr 23 08:48:05.989858 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989634 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsd6b\" (UniqueName: \"kubernetes.io/projected/0991960b-d8b1-454f-a21e-8de493704ad2-kube-api-access-dsd6b\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj"
Apr 23 08:48:05.989858 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989658 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6d76e4ef-b56a-40fd-9c37-6eb55602fea4-hosts-file\") pod \"node-resolver-c68x8\" (UID: \"6d76e4ef-b56a-40fd-9c37-6eb55602fea4\") " pod="openshift-dns/node-resolver-c68x8"
Apr 23 08:48:05.989858 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989683 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-var-lib-openvswitch\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.989858 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989707 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-run-openvswitch\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.989858 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989732 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/83905907-ce4f-4b05-a10c-1a78044f595e-host-slash\") pod \"iptables-alerter-zf6q5\" (UID: \"83905907-ce4f-4b05-a10c-1a78044f595e\") " pod="openshift-network-operator/iptables-alerter-zf6q5"
Apr 23 08:48:05.989858 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989744 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6d76e4ef-b56a-40fd-9c37-6eb55602fea4-tmp-dir\") pod \"node-resolver-c68x8\" (UID: \"6d76e4ef-b56a-40fd-9c37-6eb55602fea4\") " pod="openshift-dns/node-resolver-c68x8"
Apr 23 08:48:05.989858 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989754 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8963f0bc-78e1-4f93-a9a9-bba51f04c437-device-dir\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p"
Apr 23 08:48:05.989858 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989821 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-etc-modprobe-d\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj"
Apr 23 08:48:05.989858 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989847 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-node-log\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.990466 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989870 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/52011e07-e36d-48b0-bed0-421685c0e544-konnectivity-ca\") pod \"konnectivity-agent-6hwnm\" (UID: \"52011e07-e36d-48b0-bed0-421685c0e544\") " pod="kube-system/konnectivity-agent-6hwnm"
Apr 23 08:48:05.990466 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989893 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-system-cni-dir\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr"
Apr 23 08:48:05.990466 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989916 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-hostroot\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr"
Apr 23 08:48:05.990466 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989936 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8e6d8633-b9de-4e37-96e7-465a09675f90-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.990466 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989946 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lshnm\" (UniqueName: \"kubernetes.io/projected/a0aeb874-3a11-4639-a579-36512ba94069-kube-api-access-lshnm\") pod \"node-ca-z6xdx\" (UID: \"a0aeb874-3a11-4639-a579-36512ba94069\") " pod="openshift-image-registry/node-ca-z6xdx"
Apr 23 08:48:05.990466 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989968 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-cni-netd\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.990466 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.989971 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8e6d8633-b9de-4e37-96e7-465a09675f90-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.990466 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990008 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-var-lib-kubelet\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj"
Apr 23 08:48:05.990466 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990055 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6d76e4ef-b56a-40fd-9c37-6eb55602fea4-hosts-file\") pod \"node-resolver-c68x8\" (UID: \"6d76e4ef-b56a-40fd-9c37-6eb55602fea4\") " pod="openshift-dns/node-resolver-c68x8"
Apr 23 08:48:05.990466 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990081 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e6d8633-b9de-4e37-96e7-465a09675f90-os-release\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.990466 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990137 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs\") pod \"network-metrics-daemon-96m6d\" (UID: \"f3e78a7f-f7cc-48d6-a29a-7d418c195aef\") " pod="openshift-multus/network-metrics-daemon-96m6d"
Apr 23 08:48:05.990466 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990204 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e6d8633-b9de-4e37-96e7-465a09675f90-os-release\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.990466 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990207 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2sqx\" (UniqueName: \"kubernetes.io/projected/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-kube-api-access-p2sqx\") pod \"network-metrics-daemon-96m6d\" (UID: \"f3e78a7f-f7cc-48d6-a29a-7d418c195aef\") " pod="openshift-multus/network-metrics-daemon-96m6d"
Apr 23 08:48:05.990466 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990235 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfrd2\" (UniqueName: \"kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2\") pod \"network-check-target-4mjb6\" (UID: \"bb54c9e2-2568-441d-a59a-ffa007afefd6\") " pod="openshift-network-diagnostics/network-check-target-4mjb6"
Apr 23 08:48:05.990466 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990256 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-run-ovn\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.990466 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990270 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-host-var-lib-cni-bin\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr"
Apr 23 08:48:05.991121 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990285 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8963f0bc-78e1-4f93-a9a9-bba51f04c437-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p"
Apr 23 08:48:05.991121 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990299 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0991960b-d8b1-454f-a21e-8de493704ad2-etc-tuned\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj"
Apr 23 08:48:05.991121 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990313 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-ovnkube-config\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.991121 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990331 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-host-run-netns\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr"
Apr 23 08:48:05.991121 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990375 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-multus-conf-dir\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr"
Apr 23 08:48:05.991121 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990419 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-etc-kubernetes\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj"
Apr 23 08:48:05.991121 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990445 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gclgm\" (UniqueName: \"kubernetes.io/projected/6d76e4ef-b56a-40fd-9c37-6eb55602fea4-kube-api-access-gclgm\") pod \"node-resolver-c68x8\" (UID: \"6d76e4ef-b56a-40fd-9c37-6eb55602fea4\") " pod="openshift-dns/node-resolver-c68x8"
Apr 23 08:48:05.991121 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990480 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-kubelet\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.991121 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990494 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-slash\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.991121 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990514 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvhb9\" (UniqueName: \"kubernetes.io/projected/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-kube-api-access-cvhb9\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.991121 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990538 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4b89f18f-6e25-4059-988c-27d5c1a39867-cni-binary-copy\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr"
Apr 23 08:48:05.991121 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990553 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-multus-socket-dir-parent\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr"
Apr 23 08:48:05.991121 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990567 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-run\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj"
Apr 23 08:48:05.991121 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990587 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e6d8633-b9de-4e37-96e7-465a09675f90-cni-binary-copy\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.991121 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990613 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-cnibin\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr"
Apr 23 08:48:05.991121 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990661 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-host-var-lib-cni-multus\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr"
Apr 23 08:48:05.991121 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990691 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4b89f18f-6e25-4059-988c-27d5c1a39867-multus-daemon-config\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr"
Apr 23 08:48:05.991971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990717 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.991971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990740 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkwpr\" (UniqueName: \"kubernetes.io/projected/83905907-ce4f-4b05-a10c-1a78044f595e-kube-api-access-bkwpr\") pod \"iptables-alerter-zf6q5\" (UID: \"83905907-ce4f-4b05-a10c-1a78044f595e\") " pod="openshift-network-operator/iptables-alerter-zf6q5"
Apr 23 08:48:05.991971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990757 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8963f0bc-78e1-4f93-a9a9-bba51f04c437-registration-dir\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p"
Apr 23 08:48:05.991971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990782 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0991960b-d8b1-454f-a21e-8de493704ad2-tmp\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj"
Apr 23 08:48:05.991971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990812 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e6d8633-b9de-4e37-96e7-465a09675f90-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.991971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990839 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-host-var-lib-kubelet\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr"
Apr 23 08:48:05.991971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990874 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-etc-openvswitch\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.991971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990901 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e6d8633-b9de-4e37-96e7-465a09675f90-cnibin\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.991971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990926 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcfg5\" (UniqueName: \"kubernetes.io/projected/8e6d8633-b9de-4e37-96e7-465a09675f90-kube-api-access-zcfg5\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.991971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990967 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e6d8633-b9de-4e37-96e7-465a09675f90-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.991971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.990973 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-systemd-units\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.991971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991027 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-log-socket\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx"
Apr 23 08:48:05.991971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991054 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e6d8633-b9de-4e37-96e7-465a09675f90-cni-binary-copy\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb"
Apr 23 08:48:05.991971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991060 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\"
(UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-run-ovn-kubernetes\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:05.991971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991064 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e6d8633-b9de-4e37-96e7-465a09675f90-cnibin\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb" Apr 23 08:48:05.991971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991099 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-env-overrides\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:05.992702 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991131 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e6d8633-b9de-4e37-96e7-465a09675f90-system-cni-dir\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb" Apr 23 08:48:05.992702 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991157 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-multus-cni-dir\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:05.992702 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991180 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-etc-kubernetes\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:05.992702 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991205 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8963f0bc-78e1-4f93-a9a9-bba51f04c437-socket-dir\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:05.992702 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991224 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e6d8633-b9de-4e37-96e7-465a09675f90-system-cni-dir\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb" Apr 23 08:48:05.992702 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991228 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8963f0bc-78e1-4f93-a9a9-bba51f04c437-etc-selinux\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:05.992702 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991272 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-etc-sysconfig\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:05.992702 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991302 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-etc-sysctl-conf\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:05.992702 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991379 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a0aeb874-3a11-4639-a579-36512ba94069-serviceca\") pod \"node-ca-z6xdx\" (UID: \"a0aeb874-3a11-4639-a579-36512ba94069\") " pod="openshift-image-registry/node-ca-z6xdx" Apr 23 08:48:05.992702 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991404 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-os-release\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:05.992702 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991423 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m94l\" (UniqueName: \"kubernetes.io/projected/4b89f18f-6e25-4059-988c-27d5c1a39867-kube-api-access-5m94l\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:05.992702 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991440 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8963f0bc-78e1-4f93-a9a9-bba51f04c437-sys-fs\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:05.992702 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991455 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjngk\" (UniqueName: \"kubernetes.io/projected/8963f0bc-78e1-4f93-a9a9-bba51f04c437-kube-api-access-rjngk\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:05.992702 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.991468 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-host\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:05.994106 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:05.994085 2579 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b712f12e316b1a6ded9d349ca82d37a.slice/crio-bacf35f21a49e8dfb9ba6344f180610f9813c833090f721ad5cd2f95f7e0e686 WatchSource:0}: Error finding container bacf35f21a49e8dfb9ba6344f180610f9813c833090f721ad5cd2f95f7e0e686: Status 404 returned error can't find the container with id bacf35f21a49e8dfb9ba6344f180610f9813c833090f721ad5cd2f95f7e0e686 Apr 23 08:48:05.999218 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:05.999197 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 08:48:06.000197 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.000173 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 08:48:06.002818 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.002796 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcfg5\" (UniqueName: \"kubernetes.io/projected/8e6d8633-b9de-4e37-96e7-465a09675f90-kube-api-access-zcfg5\") pod \"multus-additional-cni-plugins-jwnqb\" (UID: \"8e6d8633-b9de-4e37-96e7-465a09675f90\") " pod="openshift-multus/multus-additional-cni-plugins-jwnqb" Apr 23 08:48:06.003052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.003015 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gclgm\" (UniqueName: \"kubernetes.io/projected/6d76e4ef-b56a-40fd-9c37-6eb55602fea4-kube-api-access-gclgm\") pod \"node-resolver-c68x8\" (UID: \"6d76e4ef-b56a-40fd-9c37-6eb55602fea4\") " pod="openshift-dns/node-resolver-c68x8" Apr 23 08:48:06.007595 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:06.007574 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod438108d93a37ab59c6a0c9e57eee327c.slice/crio-4eee5d1830d6142a707da1e372517c9615c8c46dd04cab846a85b567ee905b80 WatchSource:0}: Error finding container 4eee5d1830d6142a707da1e372517c9615c8c46dd04cab846a85b567ee905b80: Status 404 returned error can't find the container with id 4eee5d1830d6142a707da1e372517c9615c8c46dd04cab846a85b567ee905b80 Apr 23 08:48:06.013692 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.013650 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal" event={"ID":"438108d93a37ab59c6a0c9e57eee327c","Type":"ContainerStarted","Data":"4eee5d1830d6142a707da1e372517c9615c8c46dd04cab846a85b567ee905b80"} Apr 23 08:48:06.014603 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.014585 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal" event={"ID":"8b712f12e316b1a6ded9d349ca82d37a","Type":"ContainerStarted","Data":"bacf35f21a49e8dfb9ba6344f180610f9813c833090f721ad5cd2f95f7e0e686"} Apr 23 08:48:06.092622 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092593 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-kubelet\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.092622 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092625 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-slash\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.092862 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092642 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvhb9\" (UniqueName: \"kubernetes.io/projected/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-kube-api-access-cvhb9\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.092862 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092658 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4b89f18f-6e25-4059-988c-27d5c1a39867-cni-binary-copy\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.092862 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092672 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-multus-socket-dir-parent\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.092862 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092689 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-run\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.092862 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092696 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-kubelet\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.092862 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092706 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-slash\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.092862 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092703 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-cnibin\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.092862 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092766 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-multus-socket-dir-parent\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.092862 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092780 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-cnibin\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.092862 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092769 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-host-var-lib-cni-multus\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.092862 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092786 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-run\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.092862 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092795 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-host-var-lib-cni-multus\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.092862 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092817 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4b89f18f-6e25-4059-988c-27d5c1a39867-multus-daemon-config\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.092862 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092845 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.093500 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092871 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkwpr\" (UniqueName: \"kubernetes.io/projected/83905907-ce4f-4b05-a10c-1a78044f595e-kube-api-access-bkwpr\") pod \"iptables-alerter-zf6q5\" (UID: \"83905907-ce4f-4b05-a10c-1a78044f595e\") " pod="openshift-network-operator/iptables-alerter-zf6q5" Apr 23 08:48:06.093500 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092897 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8963f0bc-78e1-4f93-a9a9-bba51f04c437-registration-dir\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:06.093500 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092921 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0991960b-d8b1-454f-a21e-8de493704ad2-tmp\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.093500 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092930 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.093500 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092947 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-host-var-lib-kubelet\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.093500 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092971 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-etc-openvswitch\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.093500 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.092975 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8963f0bc-78e1-4f93-a9a9-bba51f04c437-registration-dir\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:06.093500 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093003 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-host-var-lib-kubelet\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.093500 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093019 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-etc-openvswitch\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.093500 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093031 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-systemd-units\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.093500 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093057 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-log-socket\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.093500 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093064 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-systemd-units\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.093500 
ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093092 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-run-ovn-kubernetes\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.093500 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093117 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-env-overrides\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.093500 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093140 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-run-ovn-kubernetes\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.093500 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093144 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-multus-cni-dir\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.093500 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093118 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-log-socket\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.094377 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093171 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-etc-kubernetes\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.094377 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093208 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-multus-cni-dir\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.094377 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093206 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-etc-kubernetes\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.094377 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093235 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8963f0bc-78e1-4f93-a9a9-bba51f04c437-socket-dir\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:06.094377 
ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093273 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8963f0bc-78e1-4f93-a9a9-bba51f04c437-etc-selinux\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:06.094377 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093298 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-etc-sysconfig\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.094377 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093321 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-etc-sysctl-conf\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.094377 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093335 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8963f0bc-78e1-4f93-a9a9-bba51f04c437-etc-selinux\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:06.094377 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093331 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4b89f18f-6e25-4059-988c-27d5c1a39867-cni-binary-copy\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.094377 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093367 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a0aeb874-3a11-4639-a579-36512ba94069-serviceca\") pod \"node-ca-z6xdx\" (UID: \"a0aeb874-3a11-4639-a579-36512ba94069\") " pod="openshift-image-registry/node-ca-z6xdx" Apr 23 08:48:06.094377 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093393 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-os-release\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.094377 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093401 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-etc-sysconfig\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.094377 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093410 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4b89f18f-6e25-4059-988c-27d5c1a39867-multus-daemon-config\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.094377 
ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093416 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m94l\" (UniqueName: \"kubernetes.io/projected/4b89f18f-6e25-4059-988c-27d5c1a39867-kube-api-access-5m94l\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.094377 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093470 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8963f0bc-78e1-4f93-a9a9-bba51f04c437-sys-fs\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:06.094377 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093460 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-os-release\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.094377 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093494 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-etc-sysctl-conf\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.094377 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093362 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8963f0bc-78e1-4f93-a9a9-bba51f04c437-socket-dir\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:06.095304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093508 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjngk\" (UniqueName: \"kubernetes.io/projected/8963f0bc-78e1-4f93-a9a9-bba51f04c437-kube-api-access-rjngk\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:06.095304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093528 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8963f0bc-78e1-4f93-a9a9-bba51f04c437-sys-fs\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:06.095304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093534 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-host\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.095304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093556 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-host\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.095304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093566 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-run-systemd\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.095304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093587 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-env-overrides\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.095304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093592 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-cni-bin\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.095304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093617 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/83905907-ce4f-4b05-a10c-1a78044f595e-iptables-alerter-script\") pod \"iptables-alerter-zf6q5\" (UID: \"83905907-ce4f-4b05-a10c-1a78044f595e\") " pod="openshift-network-operator/iptables-alerter-zf6q5" Apr 23 08:48:06.095304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093632 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-run-systemd\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.095304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093643 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/52011e07-e36d-48b0-bed0-421685c0e544-agent-certs\") pod \"konnectivity-agent-6hwnm\" (UID: \"52011e07-e36d-48b0-bed0-421685c0e544\") " pod="kube-system/konnectivity-agent-6hwnm" Apr 23 08:48:06.095304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093647 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-cni-bin\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.095304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093666 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-etc-sysctl-d\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.095304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093746 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a0aeb874-3a11-4639-a579-36512ba94069-serviceca\") pod \"node-ca-z6xdx\" (UID: 
\"a0aeb874-3a11-4639-a579-36512ba94069\") " pod="openshift-image-registry/node-ca-z6xdx" Apr 23 08:48:06.095304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093774 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-etc-systemd\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.095304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093804 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-lib-modules\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.095304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093809 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-etc-sysctl-d\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.095304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093827 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a0aeb874-3a11-4639-a579-36512ba94069-host\") pod \"node-ca-z6xdx\" (UID: \"a0aeb874-3a11-4639-a579-36512ba94069\") " pod="openshift-image-registry/node-ca-z6xdx" Apr 23 08:48:06.095304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093860 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a0aeb874-3a11-4639-a579-36512ba94069-host\") pod \"node-ca-z6xdx\" (UID: \"a0aeb874-3a11-4639-a579-36512ba94069\") " pod="openshift-image-registry/node-ca-z6xdx" Apr 23 08:48:06.096031 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093869 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-ovn-node-metrics-cert\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.096031 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093881 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-lib-modules\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.096031 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093887 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-run-netns\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.096031 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093835 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-etc-systemd\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.096031 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093904 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-ovnkube-script-lib\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.096031 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093923 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-run-netns\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.096031 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093929 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-host-run-k8s-cni-cncf-io\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.096031 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093956 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-host-run-multus-certs\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.096031 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093959 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-host-run-k8s-cni-cncf-io\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.096031 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093971 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-sys\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.096031 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093998 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-sys\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.096031 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.093999 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsd6b\" (UniqueName: \"kubernetes.io/projected/0991960b-d8b1-454f-a21e-8de493704ad2-kube-api-access-dsd6b\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.096031 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094024 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-host-run-multus-certs\") pod \"multus-h2vsr\" (UID: 
\"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.096031 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094034 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-var-lib-openvswitch\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.096031 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094059 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-run-openvswitch\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.096031 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094075 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/83905907-ce4f-4b05-a10c-1a78044f595e-host-slash\") pod \"iptables-alerter-zf6q5\" (UID: \"83905907-ce4f-4b05-a10c-1a78044f595e\") " pod="openshift-network-operator/iptables-alerter-zf6q5" Apr 23 08:48:06.096031 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094090 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8963f0bc-78e1-4f93-a9a9-bba51f04c437-device-dir\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:06.096546 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094107 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-etc-modprobe-d\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.096546 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094131 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-node-log\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.096546 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094137 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-run-openvswitch\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.096546 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094156 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/52011e07-e36d-48b0-bed0-421685c0e544-konnectivity-ca\") pod \"konnectivity-agent-6hwnm\" (UID: \"52011e07-e36d-48b0-bed0-421685c0e544\") " pod="kube-system/konnectivity-agent-6hwnm" Apr 23 08:48:06.096546 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094173 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-var-lib-openvswitch\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.096546 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094181 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-system-cni-dir\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.096546 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094212 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-hostroot\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.096546 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094220 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-node-log\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.096546 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094267 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-etc-modprobe-d\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.096546 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094317 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8963f0bc-78e1-4f93-a9a9-bba51f04c437-device-dir\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:06.096546 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094325 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/83905907-ce4f-4b05-a10c-1a78044f595e-host-slash\") pod \"iptables-alerter-zf6q5\" (UID: \"83905907-ce4f-4b05-a10c-1a78044f595e\") " pod="openshift-network-operator/iptables-alerter-zf6q5" Apr 23 08:48:06.096546 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094269 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-system-cni-dir\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.096546 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094268 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-hostroot\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.096546 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094298 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lshnm\" (UniqueName: 
\"kubernetes.io/projected/a0aeb874-3a11-4639-a579-36512ba94069-kube-api-access-lshnm\") pod \"node-ca-z6xdx\" (UID: \"a0aeb874-3a11-4639-a579-36512ba94069\") " pod="openshift-image-registry/node-ca-z6xdx" Apr 23 08:48:06.096546 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094423 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-cni-netd\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.096546 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094457 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-var-lib-kubelet\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.096546 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094483 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs\") pod \"network-metrics-daemon-96m6d\" (UID: \"f3e78a7f-f7cc-48d6-a29a-7d418c195aef\") " pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:06.096546 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094509 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2sqx\" (UniqueName: \"kubernetes.io/projected/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-kube-api-access-p2sqx\") pod \"network-metrics-daemon-96m6d\" (UID: \"f3e78a7f-f7cc-48d6-a29a-7d418c195aef\") " pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:06.097059 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094533 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfrd2\" (UniqueName: \"kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2\") pod \"network-check-target-4mjb6\" (UID: \"bb54c9e2-2568-441d-a59a-ffa007afefd6\") " pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:48:06.097059 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094537 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-ovnkube-script-lib\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.097059 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094561 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-run-ovn\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.097059 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094585 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-host-var-lib-cni-bin\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.097059 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094592 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/83905907-ce4f-4b05-a10c-1a78044f595e-iptables-alerter-script\") pod \"iptables-alerter-zf6q5\" (UID: \"83905907-ce4f-4b05-a10c-1a78044f595e\") " pod="openshift-network-operator/iptables-alerter-zf6q5" Apr 23 08:48:06.097059 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094613 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8963f0bc-78e1-4f93-a9a9-bba51f04c437-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:06.097059 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:06.094633 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:06.097059 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094638 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0991960b-d8b1-454f-a21e-8de493704ad2-etc-tuned\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.097059 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094640 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-run-ovn\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.097059 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094662 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-ovnkube-config\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.097059 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094686 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-host-run-netns\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.097059 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094695 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8963f0bc-78e1-4f93-a9a9-bba51f04c437-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:06.097059 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:06.094703 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs podName:f3e78a7f-f7cc-48d6-a29a-7d418c195aef nodeName:}" failed. No retries permitted until 2026-04-23 08:48:06.594683461 +0000 UTC m=+2.124163492 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs") pod "network-metrics-daemon-96m6d" (UID: "f3e78a7f-f7cc-48d6-a29a-7d418c195aef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:06.097059 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094730 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-host-run-netns\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.097059 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094762 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-multus-conf-dir\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.097059 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094791 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-etc-kubernetes\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.097059 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094860 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-host-cni-netd\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.097562 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094898 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-multus-conf-dir\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.097562 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094766 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b89f18f-6e25-4059-988c-27d5c1a39867-host-var-lib-cni-bin\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.097562 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094952 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-var-lib-kubelet\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.097562 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.094980 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0991960b-d8b1-454f-a21e-8de493704ad2-etc-kubernetes\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.097562 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.095156 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/52011e07-e36d-48b0-bed0-421685c0e544-konnectivity-ca\") pod \"konnectivity-agent-6hwnm\" (UID: \"52011e07-e36d-48b0-bed0-421685c0e544\") " pod="kube-system/konnectivity-agent-6hwnm" Apr 23 08:48:06.097562 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.095669 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-ovnkube-config\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.097562 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.096091 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0991960b-d8b1-454f-a21e-8de493704ad2-tmp\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.097562 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.096189 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-ovn-node-metrics-cert\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.097562 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.096845 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0991960b-d8b1-454f-a21e-8de493704ad2-etc-tuned\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.097562 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.096897 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/52011e07-e36d-48b0-bed0-421685c0e544-agent-certs\") pod \"konnectivity-agent-6hwnm\" (UID: \"52011e07-e36d-48b0-bed0-421685c0e544\") " pod="kube-system/konnectivity-agent-6hwnm" Apr 23 08:48:06.100836 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:06.100810 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:48:06.100836 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:06.100839 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:48:06.101034 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:06.100852 2579 projected.go:194] Error preparing data for projected volume kube-api-access-qfrd2 for pod openshift-network-diagnostics/network-check-target-4mjb6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:06.101034 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:06.100917 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2 podName:bb54c9e2-2568-441d-a59a-ffa007afefd6 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:06.600900016 +0000 UTC m=+2.130380020 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qfrd2" (UniqueName: "kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2") pod "network-check-target-4mjb6" (UID: "bb54c9e2-2568-441d-a59a-ffa007afefd6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:06.101611 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.101587 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m94l\" (UniqueName: \"kubernetes.io/projected/4b89f18f-6e25-4059-988c-27d5c1a39867-kube-api-access-5m94l\") pod \"multus-h2vsr\" (UID: \"4b89f18f-6e25-4059-988c-27d5c1a39867\") " pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.101713 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.101596 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvhb9\" (UniqueName: \"kubernetes.io/projected/33cb4bff-3a7e-42a3-8d3d-e1c79d36437b-kube-api-access-cvhb9\") pod \"ovnkube-node-9bsnx\" (UID: \"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b\") " pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.101899 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.101882 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkwpr\" (UniqueName: \"kubernetes.io/projected/83905907-ce4f-4b05-a10c-1a78044f595e-kube-api-access-bkwpr\") pod \"iptables-alerter-zf6q5\" (UID: \"83905907-ce4f-4b05-a10c-1a78044f595e\") " pod="openshift-network-operator/iptables-alerter-zf6q5" Apr 23 08:48:06.102635 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.102614 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsd6b\" (UniqueName: \"kubernetes.io/projected/0991960b-d8b1-454f-a21e-8de493704ad2-kube-api-access-dsd6b\") pod \"tuned-9jfcj\" (UID: \"0991960b-d8b1-454f-a21e-8de493704ad2\") " pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.102730 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.102713 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjngk\" (UniqueName: \"kubernetes.io/projected/8963f0bc-78e1-4f93-a9a9-bba51f04c437-kube-api-access-rjngk\") pod \"aws-ebs-csi-driver-node-jx26p\" (UID: \"8963f0bc-78e1-4f93-a9a9-bba51f04c437\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:06.102887 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.102873 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lshnm\" (UniqueName: \"kubernetes.io/projected/a0aeb874-3a11-4639-a579-36512ba94069-kube-api-access-lshnm\") pod \"node-ca-z6xdx\" (UID: \"a0aeb874-3a11-4639-a579-36512ba94069\") " pod="openshift-image-registry/node-ca-z6xdx" Apr 23 08:48:06.103044 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.103018 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2sqx\" (UniqueName: \"kubernetes.io/projected/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-kube-api-access-p2sqx\") pod \"network-metrics-daemon-96m6d\" (UID: \"f3e78a7f-f7cc-48d6-a29a-7d418c195aef\") " pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:06.177649 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.177556 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:48:06.204762 ip-10-0-131-47 kubenswrapper[2579]: I0423 
08:48:06.204732 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-c68x8" Apr 23 08:48:06.211131 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:06.211106 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d76e4ef_b56a_40fd_9c37_6eb55602fea4.slice/crio-f1e5eee638d27c2218b5bb3a8ef452f29a09672fa3d3015c7055887cec232601 WatchSource:0}: Error finding container f1e5eee638d27c2218b5bb3a8ef452f29a09672fa3d3015c7055887cec232601: Status 404 returned error can't find the container with id f1e5eee638d27c2218b5bb3a8ef452f29a09672fa3d3015c7055887cec232601 Apr 23 08:48:06.225166 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.225147 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jwnqb" Apr 23 08:48:06.231284 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:06.231249 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e6d8633_b9de_4e37_96e7_465a09675f90.slice/crio-20ff39037babd6ded620e36921080f06105240e23b5effdd40b336ffca6cb1c7 WatchSource:0}: Error finding container 20ff39037babd6ded620e36921080f06105240e23b5effdd40b336ffca6cb1c7: Status 404 returned error can't find the container with id 20ff39037babd6ded620e36921080f06105240e23b5effdd40b336ffca6cb1c7 Apr 23 08:48:06.237146 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.237117 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" Apr 23 08:48:06.242741 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:06.242722 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8963f0bc_78e1_4f93_a9a9_bba51f04c437.slice/crio-0a4813d4b53edd29e02447453bf1389bc6fdf1fbbf79c225522075873cae5dbd WatchSource:0}: Error finding container 0a4813d4b53edd29e02447453bf1389bc6fdf1fbbf79c225522075873cae5dbd: Status 404 returned error can't find the container with id 0a4813d4b53edd29e02447453bf1389bc6fdf1fbbf79c225522075873cae5dbd Apr 23 08:48:06.249498 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.249480 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zf6q5" Apr 23 08:48:06.253996 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.253981 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-6hwnm" Apr 23 08:48:06.255707 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:06.255686 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83905907_ce4f_4b05_a10c_1a78044f595e.slice/crio-ce2f838549cfd608b7e1d78a22f0697ab20660f782baa54519b0d3bb61acd4ab WatchSource:0}: Error finding container ce2f838549cfd608b7e1d78a22f0697ab20660f782baa54519b0d3bb61acd4ab: Status 404 returned error can't find the container with id ce2f838549cfd608b7e1d78a22f0697ab20660f782baa54519b0d3bb61acd4ab Apr 23 08:48:06.260746 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:06.260725 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52011e07_e36d_48b0_bed0_421685c0e544.slice/crio-d449d0f075851c957b153acaa233c5f4760bc313487245ae1edde29bdfe7eade WatchSource:0}: Error finding container d449d0f075851c957b153acaa233c5f4760bc313487245ae1edde29bdfe7eade: Status 404 returned error can't find the container with id d449d0f075851c957b153acaa233c5f4760bc313487245ae1edde29bdfe7eade Apr 23 08:48:06.270971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.270954 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-z6xdx" Apr 23 08:48:06.277119 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:06.277096 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0aeb874_3a11_4639_a579_36512ba94069.slice/crio-58579da1b23e08ab45b509ed8bd7fb59a9ce3d2b55666894a2318e7f78a15e71 WatchSource:0}: Error finding container 58579da1b23e08ab45b509ed8bd7fb59a9ce3d2b55666894a2318e7f78a15e71: Status 404 returned error can't find the container with id 58579da1b23e08ab45b509ed8bd7fb59a9ce3d2b55666894a2318e7f78a15e71 Apr 23 08:48:06.297125 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.297106 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h2vsr" Apr 23 08:48:06.303191 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.303172 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:06.303526 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:06.303491 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b89f18f_6e25_4059_988c_27d5c1a39867.slice/crio-0ffbdb51598de596a8082926e3910e717d84b1302558b2272da7b60419283943 WatchSource:0}: Error finding container 0ffbdb51598de596a8082926e3910e717d84b1302558b2272da7b60419283943: Status 404 returned error can't find the container with id 0ffbdb51598de596a8082926e3910e717d84b1302558b2272da7b60419283943 Apr 23 08:48:06.304445 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.304428 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" Apr 23 08:48:06.310009 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:06.309987 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33cb4bff_3a7e_42a3_8d3d_e1c79d36437b.slice/crio-480aa938cfae7e48889d32233f8f55228dc0b779624251eb8e7c41d6643e2b49 WatchSource:0}: Error finding container 480aa938cfae7e48889d32233f8f55228dc0b779624251eb8e7c41d6643e2b49: Status 404 returned error can't find the container with id 480aa938cfae7e48889d32233f8f55228dc0b779624251eb8e7c41d6643e2b49 Apr 23 08:48:06.312906 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:48:06.312886 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0991960b_d8b1_454f_a21e_8de493704ad2.slice/crio-4e7c73ebd57b9b58788e171ffe87d4fd92c6078d863a98782b87b1dfdf52ffcd WatchSource:0}: Error finding container 4e7c73ebd57b9b58788e171ffe87d4fd92c6078d863a98782b87b1dfdf52ffcd: Status 404 returned error can't find the container with id 4e7c73ebd57b9b58788e171ffe87d4fd92c6078d863a98782b87b1dfdf52ffcd Apr 23 08:48:06.398590 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.398547 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:48:06.599586 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.598831 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs\") pod \"network-metrics-daemon-96m6d\" (UID: \"f3e78a7f-f7cc-48d6-a29a-7d418c195aef\") " pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:06.599586 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:06.599047 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:06.599586 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:06.599113 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs podName:f3e78a7f-f7cc-48d6-a29a-7d418c195aef nodeName:}" failed. No retries permitted until 2026-04-23 08:48:07.59909594 +0000 UTC m=+3.128575945 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs") pod "network-metrics-daemon-96m6d" (UID: "f3e78a7f-f7cc-48d6-a29a-7d418c195aef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:06.700178 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.700142 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfrd2\" (UniqueName: \"kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2\") pod \"network-check-target-4mjb6\" (UID: \"bb54c9e2-2568-441d-a59a-ffa007afefd6\") " pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:48:06.700363 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:06.700318 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:48:06.700363 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:06.700335 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:48:06.700477 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:06.700368 2579 projected.go:194] Error preparing data for projected volume kube-api-access-qfrd2 for pod openshift-network-diagnostics/network-check-target-4mjb6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:06.700477 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:06.700422 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2 podName:bb54c9e2-2568-441d-a59a-ffa007afefd6 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:07.700404104 +0000 UTC m=+3.229884113 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-qfrd2" (UniqueName: "kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2") pod "network-check-target-4mjb6" (UID: "bb54c9e2-2568-441d-a59a-ffa007afefd6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:06.920648 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.920559 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:43:05 +0000 UTC" deadline="2028-01-18 01:36:04.660878988 +0000 UTC" Apr 23 08:48:06.920648 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:06.920596 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15232h47m57.740287006s" Apr 23 08:48:07.014448 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:07.013897 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:07.014448 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:07.014037 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96m6d" podUID="f3e78a7f-f7cc-48d6-a29a-7d418c195aef" Apr 23 08:48:07.031278 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:07.031221 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-z6xdx" event={"ID":"a0aeb874-3a11-4639-a579-36512ba94069","Type":"ContainerStarted","Data":"58579da1b23e08ab45b509ed8bd7fb59a9ce3d2b55666894a2318e7f78a15e71"} Apr 23 08:48:07.040440 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:07.040377 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6hwnm" event={"ID":"52011e07-e36d-48b0-bed0-421685c0e544","Type":"ContainerStarted","Data":"d449d0f075851c957b153acaa233c5f4760bc313487245ae1edde29bdfe7eade"} Apr 23 08:48:07.047062 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:07.047030 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jwnqb" event={"ID":"8e6d8633-b9de-4e37-96e7-465a09675f90","Type":"ContainerStarted","Data":"20ff39037babd6ded620e36921080f06105240e23b5effdd40b336ffca6cb1c7"} Apr 23 08:48:07.055652 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:07.055617 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" event={"ID":"0991960b-d8b1-454f-a21e-8de493704ad2","Type":"ContainerStarted","Data":"4e7c73ebd57b9b58788e171ffe87d4fd92c6078d863a98782b87b1dfdf52ffcd"} Apr 23 08:48:07.071096 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:07.071063 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" event={"ID":"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b","Type":"ContainerStarted","Data":"480aa938cfae7e48889d32233f8f55228dc0b779624251eb8e7c41d6643e2b49"} Apr 23 08:48:07.085040 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:07.084939 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zf6q5" event={"ID":"83905907-ce4f-4b05-a10c-1a78044f595e","Type":"ContainerStarted","Data":"ce2f838549cfd608b7e1d78a22f0697ab20660f782baa54519b0d3bb61acd4ab"} Apr 23 08:48:07.098536 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:07.098505 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:48:07.099053 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:07.099024 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" event={"ID":"8963f0bc-78e1-4f93-a9a9-bba51f04c437","Type":"ContainerStarted","Data":"0a4813d4b53edd29e02447453bf1389bc6fdf1fbbf79c225522075873cae5dbd"} Apr 23 08:48:07.103037 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:07.103002 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c68x8" event={"ID":"6d76e4ef-b56a-40fd-9c37-6eb55602fea4","Type":"ContainerStarted","Data":"f1e5eee638d27c2218b5bb3a8ef452f29a09672fa3d3015c7055887cec232601"} Apr 23 08:48:07.111822 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:07.111774 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h2vsr" event={"ID":"4b89f18f-6e25-4059-988c-27d5c1a39867","Type":"ContainerStarted","Data":"0ffbdb51598de596a8082926e3910e717d84b1302558b2272da7b60419283943"} Apr 23 08:48:07.611680 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:07.611638 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs\") pod \"network-metrics-daemon-96m6d\" (UID: \"f3e78a7f-f7cc-48d6-a29a-7d418c195aef\") " pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:07.611872 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:07.611852 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:07.611935 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:07.611917 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs podName:f3e78a7f-f7cc-48d6-a29a-7d418c195aef nodeName:}" failed. No retries permitted until 2026-04-23 08:48:09.611898145 +0000 UTC m=+5.141378160 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs") pod "network-metrics-daemon-96m6d" (UID: "f3e78a7f-f7cc-48d6-a29a-7d418c195aef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:07.712132 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:07.712089 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfrd2\" (UniqueName: \"kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2\") pod \"network-check-target-4mjb6\" (UID: \"bb54c9e2-2568-441d-a59a-ffa007afefd6\") " pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:48:07.712322 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:07.712241 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:48:07.712322 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:07.712259 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:48:07.712322 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:07.712272 2579 projected.go:194] Error preparing data for projected volume kube-api-access-qfrd2 for pod openshift-network-diagnostics/network-check-target-4mjb6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:07.712504 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:07.712329 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2 podName:bb54c9e2-2568-441d-a59a-ffa007afefd6 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:09.71231115 +0000 UTC m=+5.241791171 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qfrd2" (UniqueName: "kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2") pod "network-check-target-4mjb6" (UID: "bb54c9e2-2568-441d-a59a-ffa007afefd6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:07.921068 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:07.921018 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:43:05 +0000 UTC" deadline="2028-01-23 12:57:33.382227668 +0000 UTC" Apr 23 08:48:07.921068 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:07.921064 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15364h9m25.461168458s" Apr 23 08:48:08.012027 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:08.011989 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:48:08.012187 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:08.012116 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4mjb6" podUID="bb54c9e2-2568-441d-a59a-ffa007afefd6" Apr 23 08:48:08.567286 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:08.567252 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:48:09.012055 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:09.012024 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:09.012605 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:09.012165 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96m6d" podUID="f3e78a7f-f7cc-48d6-a29a-7d418c195aef" Apr 23 08:48:09.266471 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:09.266389 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-q6kd2"] Apr 23 08:48:09.269746 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:09.269229 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:09.269746 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:09.269310 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q6kd2" podUID="f0ec8bf3-b2b1-4ced-9317-2706c95af066" Apr 23 08:48:09.327029 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:09.326753 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret\") pod \"global-pull-secret-syncer-q6kd2\" (UID: \"f0ec8bf3-b2b1-4ced-9317-2706c95af066\") " pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:09.327029 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:09.326839 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f0ec8bf3-b2b1-4ced-9317-2706c95af066-dbus\") pod \"global-pull-secret-syncer-q6kd2\" (UID: \"f0ec8bf3-b2b1-4ced-9317-2706c95af066\") " pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:09.327029 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:09.326881 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f0ec8bf3-b2b1-4ced-9317-2706c95af066-kubelet-config\") pod \"global-pull-secret-syncer-q6kd2\" (UID: \"f0ec8bf3-b2b1-4ced-9317-2706c95af066\") " pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:09.427660 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:09.427609 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret\") pod \"global-pull-secret-syncer-q6kd2\" (UID: \"f0ec8bf3-b2b1-4ced-9317-2706c95af066\") " pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:09.427660 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:09.427666 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f0ec8bf3-b2b1-4ced-9317-2706c95af066-dbus\") pod \"global-pull-secret-syncer-q6kd2\" (UID: \"f0ec8bf3-b2b1-4ced-9317-2706c95af066\") " pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:09.427900 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:09.427703 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f0ec8bf3-b2b1-4ced-9317-2706c95af066-kubelet-config\") pod \"global-pull-secret-syncer-q6kd2\" (UID: \"f0ec8bf3-b2b1-4ced-9317-2706c95af066\") " pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:09.427900 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:09.427804 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f0ec8bf3-b2b1-4ced-9317-2706c95af066-kubelet-config\") pod \"global-pull-secret-syncer-q6kd2\" (UID: \"f0ec8bf3-b2b1-4ced-9317-2706c95af066\") " pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:09.428000 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:09.427907 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:48:09.428000 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:09.427964 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret podName:f0ec8bf3-b2b1-4ced-9317-2706c95af066 nodeName:}" failed. 
No retries permitted until 2026-04-23 08:48:09.927945367 +0000 UTC m=+5.457425385 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret") pod "global-pull-secret-syncer-q6kd2" (UID: "f0ec8bf3-b2b1-4ced-9317-2706c95af066") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:48:09.428309 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:09.428206 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f0ec8bf3-b2b1-4ced-9317-2706c95af066-dbus\") pod \"global-pull-secret-syncer-q6kd2\" (UID: \"f0ec8bf3-b2b1-4ced-9317-2706c95af066\") " pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:09.628943 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:09.628909 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs\") pod \"network-metrics-daemon-96m6d\" (UID: \"f3e78a7f-f7cc-48d6-a29a-7d418c195aef\") " pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:09.629122 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:09.629105 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:09.629190 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:09.629176 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs podName:f3e78a7f-f7cc-48d6-a29a-7d418c195aef nodeName:}" failed. No retries permitted until 2026-04-23 08:48:13.62915646 +0000 UTC m=+9.158636470 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs") pod "network-metrics-daemon-96m6d" (UID: "f3e78a7f-f7cc-48d6-a29a-7d418c195aef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:09.729627 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:09.729587 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfrd2\" (UniqueName: \"kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2\") pod \"network-check-target-4mjb6\" (UID: \"bb54c9e2-2568-441d-a59a-ffa007afefd6\") " pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:48:09.729800 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:09.729747 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:48:09.729800 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:09.729773 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:48:09.729800 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:09.729789 2579 projected.go:194] Error preparing data for projected volume kube-api-access-qfrd2 for pod openshift-network-diagnostics/network-check-target-4mjb6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:09.729971 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:09.729861 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2 podName:bb54c9e2-2568-441d-a59a-ffa007afefd6 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:13.729840558 +0000 UTC m=+9.259320565 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-qfrd2" (UniqueName: "kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2") pod "network-check-target-4mjb6" (UID: "bb54c9e2-2568-441d-a59a-ffa007afefd6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:09.931132 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:09.931049 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret\") pod \"global-pull-secret-syncer-q6kd2\" (UID: \"f0ec8bf3-b2b1-4ced-9317-2706c95af066\") " pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:09.931284 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:09.931253 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:48:09.931378 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:09.931315 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret podName:f0ec8bf3-b2b1-4ced-9317-2706c95af066 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:10.931296826 +0000 UTC m=+6.460776834 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret") pod "global-pull-secret-syncer-q6kd2" (UID: "f0ec8bf3-b2b1-4ced-9317-2706c95af066") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:48:10.011456 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:10.011418 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:48:10.011637 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:10.011572 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4mjb6" podUID="bb54c9e2-2568-441d-a59a-ffa007afefd6" Apr 23 08:48:10.938241 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:10.938200 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret\") pod \"global-pull-secret-syncer-q6kd2\" (UID: \"f0ec8bf3-b2b1-4ced-9317-2706c95af066\") " pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:10.938711 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:10.938400 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:48:10.938711 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:10.938471 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret podName:f0ec8bf3-b2b1-4ced-9317-2706c95af066 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:12.938454565 +0000 UTC m=+8.467934577 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret") pod "global-pull-secret-syncer-q6kd2" (UID: "f0ec8bf3-b2b1-4ced-9317-2706c95af066") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:48:11.011945 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:11.011908 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:11.012119 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:11.012033 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q6kd2" podUID="f0ec8bf3-b2b1-4ced-9317-2706c95af066" Apr 23 08:48:11.012637 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:11.012473 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:11.012637 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:11.012598 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96m6d" podUID="f3e78a7f-f7cc-48d6-a29a-7d418c195aef" Apr 23 08:48:12.012262 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:12.012226 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:48:12.012761 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:12.012377 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4mjb6" podUID="bb54c9e2-2568-441d-a59a-ffa007afefd6" Apr 23 08:48:12.965288 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:12.964689 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret\") pod \"global-pull-secret-syncer-q6kd2\" (UID: \"f0ec8bf3-b2b1-4ced-9317-2706c95af066\") " pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:12.965288 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:12.964869 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 08:48:12.965288 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:12.964932 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret podName:f0ec8bf3-b2b1-4ced-9317-2706c95af066 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:16.964912777 +0000 UTC m=+12.494392796 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret") pod "global-pull-secret-syncer-q6kd2" (UID: "f0ec8bf3-b2b1-4ced-9317-2706c95af066") : object "kube-system"/"original-pull-secret" not registered Apr 23 08:48:13.013528 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:13.013487 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:13.013973 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:13.013631 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q6kd2" podUID="f0ec8bf3-b2b1-4ced-9317-2706c95af066" Apr 23 08:48:13.013973 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:13.013691 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-96m6d"
Apr 23 08:48:13.013973 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:13.013823 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96m6d" podUID="f3e78a7f-f7cc-48d6-a29a-7d418c195aef"
Apr 23 08:48:13.670261 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:13.670216 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs\") pod \"network-metrics-daemon-96m6d\" (UID: \"f3e78a7f-f7cc-48d6-a29a-7d418c195aef\") " pod="openshift-multus/network-metrics-daemon-96m6d"
Apr 23 08:48:13.670465 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:13.670410 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:48:13.670540 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:13.670480 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs podName:f3e78a7f-f7cc-48d6-a29a-7d418c195aef nodeName:}" failed. No retries permitted until 2026-04-23 08:48:21.670454385 +0000 UTC m=+17.199934405 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs") pod "network-metrics-daemon-96m6d" (UID: "f3e78a7f-f7cc-48d6-a29a-7d418c195aef") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:48:13.771189 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:13.771111 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfrd2\" (UniqueName: \"kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2\") pod \"network-check-target-4mjb6\" (UID: \"bb54c9e2-2568-441d-a59a-ffa007afefd6\") " pod="openshift-network-diagnostics/network-check-target-4mjb6"
Apr 23 08:48:13.771405 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:13.771306 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:48:13.771405 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:13.771334 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:48:13.771405 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:13.771365 2579 projected.go:194] Error preparing data for projected volume kube-api-access-qfrd2 for pod openshift-network-diagnostics/network-check-target-4mjb6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:48:13.771561 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:13.771429 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2 podName:bb54c9e2-2568-441d-a59a-ffa007afefd6 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:21.771409652 +0000 UTC m=+17.300889659 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-qfrd2" (UniqueName: "kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2") pod "network-check-target-4mjb6" (UID: "bb54c9e2-2568-441d-a59a-ffa007afefd6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:48:14.011629 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:14.011548 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6"
Apr 23 08:48:14.011775 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:14.011668 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4mjb6" podUID="bb54c9e2-2568-441d-a59a-ffa007afefd6"
Apr 23 08:48:15.012927 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:15.012883 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96m6d"
Apr 23 08:48:15.013379 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:15.012987 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96m6d" podUID="f3e78a7f-f7cc-48d6-a29a-7d418c195aef"
Apr 23 08:48:15.013379 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:15.013358 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q6kd2"
Apr 23 08:48:15.013505 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:15.013438 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q6kd2" podUID="f0ec8bf3-b2b1-4ced-9317-2706c95af066"
Apr 23 08:48:16.012269 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:16.012232 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6"
Apr 23 08:48:16.012446 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:16.012336 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4mjb6" podUID="bb54c9e2-2568-441d-a59a-ffa007afefd6"
Apr 23 08:48:16.998049 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:16.998007 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret\") pod \"global-pull-secret-syncer-q6kd2\" (UID: \"f0ec8bf3-b2b1-4ced-9317-2706c95af066\") " pod="kube-system/global-pull-secret-syncer-q6kd2"
Apr 23 08:48:16.998533 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:16.998164 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:48:16.998533 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:16.998242 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret podName:f0ec8bf3-b2b1-4ced-9317-2706c95af066 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:24.998221857 +0000 UTC m=+20.527701875 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret") pod "global-pull-secret-syncer-q6kd2" (UID: "f0ec8bf3-b2b1-4ced-9317-2706c95af066") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:48:17.011523 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:17.011482 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96m6d"
Apr 23 08:48:17.011689 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:17.011526 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q6kd2"
Apr 23 08:48:17.011689 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:17.011601 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96m6d" podUID="f3e78a7f-f7cc-48d6-a29a-7d418c195aef"
Apr 23 08:48:17.011799 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:17.011708 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q6kd2" podUID="f0ec8bf3-b2b1-4ced-9317-2706c95af066"
Apr 23 08:48:18.012114 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:18.012077 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6"
Apr 23 08:48:18.012565 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:18.012198 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4mjb6" podUID="bb54c9e2-2568-441d-a59a-ffa007afefd6"
Apr 23 08:48:19.011992 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:19.011951 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q6kd2"
Apr 23 08:48:19.012214 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:19.011952 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96m6d"
Apr 23 08:48:19.012214 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:19.012096 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q6kd2" podUID="f0ec8bf3-b2b1-4ced-9317-2706c95af066"
Apr 23 08:48:19.012214 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:19.012201 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96m6d" podUID="f3e78a7f-f7cc-48d6-a29a-7d418c195aef"
Apr 23 08:48:20.011911 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:20.011886 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6"
Apr 23 08:48:20.012030 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:20.011983 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4mjb6" podUID="bb54c9e2-2568-441d-a59a-ffa007afefd6"
Apr 23 08:48:21.011368 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:21.011213 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96m6d"
Apr 23 08:48:21.011368 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:21.011210 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q6kd2"
Apr 23 08:48:21.011932 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:21.011395 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96m6d" podUID="f3e78a7f-f7cc-48d6-a29a-7d418c195aef"
Apr 23 08:48:21.011932 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:21.011781 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q6kd2" podUID="f0ec8bf3-b2b1-4ced-9317-2706c95af066"
pod="kube-system/global-pull-secret-syncer-q6kd2" podUID="f0ec8bf3-b2b1-4ced-9317-2706c95af066" Apr 23 08:48:21.731362 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:21.731304 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs\") pod \"network-metrics-daemon-96m6d\" (UID: \"f3e78a7f-f7cc-48d6-a29a-7d418c195aef\") " pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:21.731605 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:21.731505 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:21.731605 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:21.731588 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs podName:f3e78a7f-f7cc-48d6-a29a-7d418c195aef nodeName:}" failed. No retries permitted until 2026-04-23 08:48:37.731567307 +0000 UTC m=+33.261047329 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs") pod "network-metrics-daemon-96m6d" (UID: "f3e78a7f-f7cc-48d6-a29a-7d418c195aef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:21.832665 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:21.832624 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfrd2\" (UniqueName: \"kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2\") pod \"network-check-target-4mjb6\" (UID: \"bb54c9e2-2568-441d-a59a-ffa007afefd6\") " pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:48:21.832846 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:21.832767 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:48:21.832846 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:21.832788 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:48:21.832846 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:21.832804 2579 projected.go:194] Error preparing data for projected volume kube-api-access-qfrd2 for pod openshift-network-diagnostics/network-check-target-4mjb6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:21.832963 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:21.832862 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2 podName:bb54c9e2-2568-441d-a59a-ffa007afefd6 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:37.832846448 +0000 UTC m=+33.362326453 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qfrd2" (UniqueName: "kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2") pod "network-check-target-4mjb6" (UID: "bb54c9e2-2568-441d-a59a-ffa007afefd6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:22.011930 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:22.011845 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:48:22.012362 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:22.011973 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4mjb6" podUID="bb54c9e2-2568-441d-a59a-ffa007afefd6" Apr 23 08:48:23.011558 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:23.011517 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:23.011733 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:23.011532 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:23.011733 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:23.011659 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96m6d" podUID="f3e78a7f-f7cc-48d6-a29a-7d418c195aef" Apr 23 08:48:23.011817 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:23.011731 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q6kd2" podUID="f0ec8bf3-b2b1-4ced-9317-2706c95af066" Apr 23 08:48:24.012144 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:24.012107 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:48:24.012652 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:24.012254 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4mjb6" podUID="bb54c9e2-2568-441d-a59a-ffa007afefd6" Apr 23 08:48:25.011756 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:25.011728 2579 util.go:30] "No sandbox for pod can be found. 
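Note the retry schedule in the nestedpendingoperations.go:348 entries: the same mount operations that were deferred 8s at 08:48:13 are now deferred 16s, and later 32s (elsewhere a fresh failure starts at 500ms). A small Go sketch of that doubling pattern; the initial delay matches the log, while the cap is an assumption, not something this log shows:

    package main

    import (
        "fmt"
        "time"
    )

    // expBackoff mirrors the doubling durationBeforeRetry visible above:
    // 500ms, ..., 8s, 16s, 32s.
    type expBackoff struct {
        delay time.Duration
        max   time.Duration
    }

    func (b *expBackoff) next() time.Duration {
        if b.delay == 0 {
            b.delay = 500 * time.Millisecond // first durationBeforeRetry seen in the log
            return b.delay
        }
        b.delay *= 2
        if b.delay > b.max {
            b.delay = b.max // ceiling is assumed for the sketch
        }
        return b.delay
    }

    func main() {
        b := expBackoff{max: 2 * time.Minute}
        for i := 0; i < 8; i++ {
            fmt.Println(b.next()) // 500ms 1s 2s 4s 8s 16s 32s 1m4s
        }
    }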
Apr 23 08:48:25.011936 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:25.011858 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96m6d" podUID="f3e78a7f-f7cc-48d6-a29a-7d418c195aef"
Apr 23 08:48:25.013206 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:25.013189 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q6kd2"
Apr 23 08:48:25.014126 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:25.014048 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q6kd2" podUID="f0ec8bf3-b2b1-4ced-9317-2706c95af066"
Apr 23 08:48:25.060144 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:25.059966 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret\") pod \"global-pull-secret-syncer-q6kd2\" (UID: \"f0ec8bf3-b2b1-4ced-9317-2706c95af066\") " pod="kube-system/global-pull-secret-syncer-q6kd2"
Apr 23 08:48:25.060144 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:25.060138 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 08:48:25.060291 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:25.060211 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret podName:f0ec8bf3-b2b1-4ced-9317-2706c95af066 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:41.060188363 +0000 UTC m=+36.589668377 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret") pod "global-pull-secret-syncer-q6kd2" (UID: "f0ec8bf3-b2b1-4ced-9317-2706c95af066") : object "kube-system"/"original-pull-secret" not registered
Apr 23 08:48:25.152163 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:25.152121 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h2vsr" event={"ID":"4b89f18f-6e25-4059-988c-27d5c1a39867","Type":"ContainerStarted","Data":"4441d0ac8a690dcbc8a12d6dc13ecfc8d5f7bd6c657dd695a59c405b4c0c9825"}
Apr 23 08:48:25.158653 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:25.158614 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal" event={"ID":"8b712f12e316b1a6ded9d349ca82d37a","Type":"ContainerStarted","Data":"8883ac301579f61422a254bc45d9d3cd659e3b343a36b69299f358487842c15b"}
Apr 23 08:48:25.161024 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:25.160990 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" event={"ID":"0991960b-d8b1-454f-a21e-8de493704ad2","Type":"ContainerStarted","Data":"b830e86117858fe63917e4f5ff5300b58315ec72ff337b5a97b8bb9cedbefc47"}
Apr 23 08:48:25.163952 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:25.163931 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log"
Apr 23 08:48:25.165027 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:25.164786 2579 generic.go:358] "Generic (PLEG): container finished" podID="33cb4bff-3a7e-42a3-8d3d-e1c79d36437b" containerID="f254518930934ac538b0f8d8d5f8dd3a9422119975a793564b7a7f4b65241c89" exitCode=1
Apr 23 08:48:25.165027 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:25.164842 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" event={"ID":"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b","Type":"ContainerStarted","Data":"a912fab5fa29a561710c2c51e49daae68ac68d20d0a6acc98605679333b67784"}
Apr 23 08:48:25.165027 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:25.164866 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" event={"ID":"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b","Type":"ContainerStarted","Data":"f0cae87e806376f1c9710a9a2b1a5a53aff11884295d715bafe9f955a51a8b3c"}
Apr 23 08:48:25.165027 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:25.164880 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" event={"ID":"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b","Type":"ContainerStarted","Data":"167107e2e8aab3237b9a13078ff4cac7088093b9442e29ed02bcc8027e858990"}
Apr 23 08:48:25.165027 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:25.164893 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" event={"ID":"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b","Type":"ContainerDied","Data":"f254518930934ac538b0f8d8d5f8dd3a9422119975a793564b7a7f4b65241c89"}
Apr 23 08:48:25.165027 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:25.164907 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" event={"ID":"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b","Type":"ContainerStarted","Data":"3b9ef0e7fe72b819a94fb080433905231237aae4286df3fcf138c363cf93161a"}
Apr 23 08:48:25.169898 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:25.169844 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-h2vsr" podStartSLOduration=1.6633765390000002 podStartE2EDuration="20.169831258s" podCreationTimestamp="2026-04-23 08:48:05 +0000 UTC" firstStartedPulling="2026-04-23 08:48:06.306443232 +0000 UTC m=+1.835923237" lastFinishedPulling="2026-04-23 08:48:24.81289795 +0000 UTC m=+20.342377956" observedRunningTime="2026-04-23 08:48:25.169281378 +0000 UTC m=+20.698761404" watchObservedRunningTime="2026-04-23 08:48:25.169831258 +0000 UTC m=+20.699311284"
Apr 23 08:48:25.182119 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:25.181610 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal" podStartSLOduration=20.1815943 podStartE2EDuration="20.1815943s" podCreationTimestamp="2026-04-23 08:48:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:48:25.181117763 +0000 UTC m=+20.710597791" watchObservedRunningTime="2026-04-23 08:48:25.1815943 +0000 UTC m=+20.711074327"
Apr 23 08:48:25.197599 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:25.197536 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9jfcj" podStartSLOduration=2.084096396 podStartE2EDuration="20.19751722s" podCreationTimestamp="2026-04-23 08:48:05 +0000 UTC" firstStartedPulling="2026-04-23 08:48:06.314261228 +0000 UTC m=+1.843741234" lastFinishedPulling="2026-04-23 08:48:24.427682051 +0000 UTC m=+19.957162058" observedRunningTime="2026-04-23 08:48:25.197152052 +0000 UTC m=+20.726632079" watchObservedRunningTime="2026-04-23 08:48:25.19751722 +0000 UTC m=+20.726997247"
Apr 23 08:48:26.012123 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.011706 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6"
Apr 23 08:48:26.012123 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:26.011823 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4mjb6" podUID="bb54c9e2-2568-441d-a59a-ffa007afefd6"
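The pod_startup_latency_tracker.go:104 values above appear to satisfy podStartSLOduration = (observedRunningTime - podCreationTimestamp) - (lastFinishedPulling - firstStartedPulling): for multus-h2vsr, 20.169831258s minus an 18.506s pull window gives the logged ~1.663s. A short Go sketch reproducing the arithmetic (an inference from the logged values, not the tracker's actual code; the tracker evidently uses monotonic readings, so the result matches to within a nanosecond):

    package main

    import (
        "fmt"
        "time"
    )

    func mustParse(s string) time.Time {
        t, err := time.Parse(time.RFC3339Nano, s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        // Values for openshift-multus/multus-h2vsr, copied from the entry above.
        created := mustParse("2026-04-23T08:48:05Z")
        firstPull := mustParse("2026-04-23T08:48:06.306443232Z")
        lastPull := mustParse("2026-04-23T08:48:24.812897950Z")
        running := mustParse("2026-04-23T08:48:25.169831258Z")

        e2e := running.Sub(created)          // podStartE2EDuration: 20.169831258s
        slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration: ~1.663376539s
        fmt.Println(e2e, slo)
    }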
pod="openshift-network-diagnostics/network-check-target-4mjb6" podUID="bb54c9e2-2568-441d-a59a-ffa007afefd6" Apr 23 08:48:26.168631 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.168596 2579 generic.go:358] "Generic (PLEG): container finished" podID="438108d93a37ab59c6a0c9e57eee327c" containerID="a3ed7aeb086c1247fd8fbca207819c903f52d72ce8845a172ed99ab9c4065fe7" exitCode=0 Apr 23 08:48:26.169140 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.168704 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal" event={"ID":"438108d93a37ab59c6a0c9e57eee327c","Type":"ContainerDied","Data":"a3ed7aeb086c1247fd8fbca207819c903f52d72ce8845a172ed99ab9c4065fe7"} Apr 23 08:48:26.169140 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.168925 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal" Apr 23 08:48:26.170130 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.170103 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-z6xdx" event={"ID":"a0aeb874-3a11-4639-a579-36512ba94069","Type":"ContainerStarted","Data":"7b72745ac27f59544f14650803efa6c0baf13bb94eee940c5d8ede0a979ec033"} Apr 23 08:48:26.171958 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.171614 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6hwnm" event={"ID":"52011e07-e36d-48b0-bed0-421685c0e544","Type":"ContainerStarted","Data":"b9d255752f79e64c082e3384788cbd85817ac25f8f4f1e3c5713e520ab69e622"} Apr 23 08:48:26.173093 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.173072 2579 generic.go:358] "Generic (PLEG): container finished" podID="8e6d8633-b9de-4e37-96e7-465a09675f90" containerID="530df24ca4ee767a75a4a7376342a823b8773cdbd66aef2de085dd5daa3e021d" exitCode=0 Apr 23 08:48:26.173162 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.173132 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jwnqb" event={"ID":"8e6d8633-b9de-4e37-96e7-465a09675f90","Type":"ContainerDied","Data":"530df24ca4ee767a75a4a7376342a823b8773cdbd66aef2de085dd5daa3e021d"} Apr 23 08:48:26.176869 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.176846 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log" Apr 23 08:48:26.177275 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.177252 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" event={"ID":"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b","Type":"ContainerStarted","Data":"e92e931e6b2b62fa0a83a7539a54bde42a32de861e2f78e1dbe98943f6a50c3f"} Apr 23 08:48:26.178776 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.178754 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zf6q5" event={"ID":"83905907-ce4f-4b05-a10c-1a78044f595e","Type":"ContainerStarted","Data":"e40ea15872b6a0588f385ad5b79ffd51aafe7d9ba8d06951771a033feeeb2234"} Apr 23 08:48:26.178949 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.178934 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 08:48:26.179714 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.179693 2579 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal"] Apr 23 08:48:26.180314 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.180288 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" event={"ID":"8963f0bc-78e1-4f93-a9a9-bba51f04c437","Type":"ContainerStarted","Data":"335e283adb9a90b8244847678bbee226a0993bc90e3d861e230b8f0ada4319c8"} Apr 23 08:48:26.181648 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.181625 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c68x8" event={"ID":"6d76e4ef-b56a-40fd-9c37-6eb55602fea4","Type":"ContainerStarted","Data":"f273d7778415c2f73c97d75cd18827938c95fdda8a048203f6e9aed4496fcafc"} Apr 23 08:48:26.184729 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.184690 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-z6xdx" podStartSLOduration=3.084480004 podStartE2EDuration="21.184678342s" podCreationTimestamp="2026-04-23 08:48:05 +0000 UTC" firstStartedPulling="2026-04-23 08:48:06.278505421 +0000 UTC m=+1.807985429" lastFinishedPulling="2026-04-23 08:48:24.378703763 +0000 UTC m=+19.908183767" observedRunningTime="2026-04-23 08:48:26.184137103 +0000 UTC m=+21.713617128" watchObservedRunningTime="2026-04-23 08:48:26.184678342 +0000 UTC m=+21.714158396" Apr 23 08:48:26.197186 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.197137 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6hwnm" podStartSLOduration=3.080643759 podStartE2EDuration="21.197121165s" podCreationTimestamp="2026-04-23 08:48:05 +0000 UTC" firstStartedPulling="2026-04-23 08:48:06.262231827 +0000 UTC m=+1.791711830" lastFinishedPulling="2026-04-23 08:48:24.378709232 +0000 UTC m=+19.908189236" observedRunningTime="2026-04-23 08:48:26.197010052 +0000 UTC m=+21.726490077" watchObservedRunningTime="2026-04-23 08:48:26.197121165 +0000 UTC m=+21.726601190" Apr 23 08:48:26.225763 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.225696 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-c68x8" podStartSLOduration=3.01173769 podStartE2EDuration="21.225674959s" podCreationTimestamp="2026-04-23 08:48:05 +0000 UTC" firstStartedPulling="2026-04-23 08:48:06.212739031 +0000 UTC m=+1.742219038" lastFinishedPulling="2026-04-23 08:48:24.4266763 +0000 UTC m=+19.956156307" observedRunningTime="2026-04-23 08:48:26.224848521 +0000 UTC m=+21.754328581" watchObservedRunningTime="2026-04-23 08:48:26.225674959 +0000 UTC m=+21.755154992" Apr 23 08:48:26.239769 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.239721 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zf6q5" podStartSLOduration=3.118398291 podStartE2EDuration="21.239705872s" podCreationTimestamp="2026-04-23 08:48:05 +0000 UTC" firstStartedPulling="2026-04-23 08:48:06.257592587 +0000 UTC m=+1.787072592" lastFinishedPulling="2026-04-23 08:48:24.37890017 +0000 UTC m=+19.908380173" observedRunningTime="2026-04-23 08:48:26.239360218 +0000 UTC m=+21.768840246" watchObservedRunningTime="2026-04-23 08:48:26.239705872 +0000 UTC m=+21.769185897" Apr 23 08:48:26.430654 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.430491 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" 
path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 08:48:26.953066 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.952926 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T08:48:26.430649581Z","UUID":"83cf1dd4-1143-4437-869d-3cc5a4bcc681","Handler":null,"Name":"","Endpoint":""} Apr 23 08:48:26.955734 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.955708 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 08:48:26.955734 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:26.955741 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 08:48:27.013190 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:27.013059 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:27.013190 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:27.013187 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q6kd2" podUID="f0ec8bf3-b2b1-4ced-9317-2706c95af066" Apr 23 08:48:27.013419 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:27.013226 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:27.013419 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:27.013289 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96m6d" podUID="f3e78a7f-f7cc-48d6-a29a-7d418c195aef" Apr 23 08:48:27.186166 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:27.186127 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" event={"ID":"8963f0bc-78e1-4f93-a9a9-bba51f04c437","Type":"ContainerStarted","Data":"85c35bd167f95c3ee3ee115844910c92e47024f022297d39e878c16113932206"} Apr 23 08:48:27.189241 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:27.189212 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal" event={"ID":"438108d93a37ab59c6a0c9e57eee327c","Type":"ContainerStarted","Data":"f20e90c5c17681fdc1d6cac07c8a4856e33199b07e785ca99dfdda78a95cbd40"} Apr 23 08:48:27.203283 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:27.203183 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal" podStartSLOduration=1.203168618 podStartE2EDuration="1.203168618s" podCreationTimestamp="2026-04-23 08:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:48:27.202770523 +0000 UTC m=+22.732250548" watchObservedRunningTime="2026-04-23 08:48:27.203168618 +0000 UTC m=+22.732648643" Apr 23 08:48:28.011781 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:28.011749 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:48:28.011960 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:28.011864 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4mjb6" podUID="bb54c9e2-2568-441d-a59a-ffa007afefd6" Apr 23 08:48:28.192551 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:28.192511 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" event={"ID":"8963f0bc-78e1-4f93-a9a9-bba51f04c437","Type":"ContainerStarted","Data":"7be0770a3e400fd907bf0c0514d6750de65d034e80e62d2cc984399504ffb701"} Apr 23 08:48:28.195510 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:28.195487 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log" Apr 23 08:48:28.195832 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:28.195804 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" event={"ID":"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b","Type":"ContainerStarted","Data":"4c809e51186f205f6e4bc7d08caa45e91786ed05b5045f35993874313e200a57"} Apr 23 08:48:28.210425 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:28.210358 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jx26p" podStartSLOduration=2.1308669670000002 podStartE2EDuration="23.210321369s" podCreationTimestamp="2026-04-23 08:48:05 +0000 UTC" firstStartedPulling="2026-04-23 08:48:06.244212859 +0000 UTC m=+1.773692878" lastFinishedPulling="2026-04-23 08:48:27.323667251 +0000 UTC m=+22.853147280" observedRunningTime="2026-04-23 08:48:28.209327268 +0000 UTC m=+23.738807293" watchObservedRunningTime="2026-04-23 08:48:28.210321369 +0000 UTC m=+23.739801397" Apr 23 08:48:29.011454 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:29.011418 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:29.011643 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:29.011457 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:29.011643 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:29.011568 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q6kd2" podUID="f0ec8bf3-b2b1-4ced-9317-2706c95af066" Apr 23 08:48:29.011801 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:29.011780 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96m6d" podUID="f3e78a7f-f7cc-48d6-a29a-7d418c195aef" Apr 23 08:48:30.012269 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:30.012234 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:48:30.012848 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:30.012387 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4mjb6" podUID="bb54c9e2-2568-441d-a59a-ffa007afefd6" Apr 23 08:48:30.842565 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:30.842362 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6hwnm" Apr 23 08:48:30.842978 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:30.842958 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6hwnm" Apr 23 08:48:31.011538 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:31.011502 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:31.011706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:31.011505 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:31.011760 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:31.011702 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96m6d" podUID="f3e78a7f-f7cc-48d6-a29a-7d418c195aef" Apr 23 08:48:31.011760 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:31.011597 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q6kd2" podUID="f0ec8bf3-b2b1-4ced-9317-2706c95af066" Apr 23 08:48:31.203763 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:31.203722 2579 generic.go:358] "Generic (PLEG): container finished" podID="8e6d8633-b9de-4e37-96e7-465a09675f90" containerID="9a3814ed371aa8fd684379a3a5827beff8cd3e7f22e2c7a1ba6812409c042001" exitCode=0 Apr 23 08:48:31.204198 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:31.203808 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jwnqb" event={"ID":"8e6d8633-b9de-4e37-96e7-465a09675f90","Type":"ContainerDied","Data":"9a3814ed371aa8fd684379a3a5827beff8cd3e7f22e2c7a1ba6812409c042001"} Apr 23 08:48:31.207149 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:31.207122 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log" Apr 23 08:48:31.207566 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:31.207539 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" event={"ID":"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b","Type":"ContainerStarted","Data":"05852ea1289344db20b37b127157145b0ed10bc75ab40aa2f6f65ff4fae70fee"} Apr 23 08:48:31.207980 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:31.207918 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:31.207980 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:31.207948 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6hwnm" Apr 23 08:48:31.207980 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:31.207962 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:31.208131 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:31.208015 2579 scope.go:117] "RemoveContainer" containerID="f254518930934ac538b0f8d8d5f8dd3a9422119975a793564b7a7f4b65241c89" Apr 23 08:48:31.208459 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:31.208440 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6hwnm" Apr 23 08:48:31.223841 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:31.223815 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:32.011818 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:32.011634 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:48:32.011976 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:32.011898 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4mjb6" podUID="bb54c9e2-2568-441d-a59a-ffa007afefd6" Apr 23 08:48:32.211111 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:32.211077 2579 generic.go:358] "Generic (PLEG): container finished" podID="8e6d8633-b9de-4e37-96e7-465a09675f90" containerID="ece7dc3dd9f916322a72d3811f18eea04a19dd20617939a03ae0da3022a6322e" exitCode=0 Apr 23 08:48:32.211745 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:32.211161 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jwnqb" event={"ID":"8e6d8633-b9de-4e37-96e7-465a09675f90","Type":"ContainerDied","Data":"ece7dc3dd9f916322a72d3811f18eea04a19dd20617939a03ae0da3022a6322e"} Apr 23 08:48:32.214402 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:32.214379 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log" Apr 23 08:48:32.214783 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:32.214759 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" event={"ID":"33cb4bff-3a7e-42a3-8d3d-e1c79d36437b","Type":"ContainerStarted","Data":"b4c02bcc880ed1a93ec75456fe4e3286d67493c77edfa92d606e968b0ff3e58f"} Apr 23 08:48:32.215087 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:32.215058 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:32.230217 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:32.230135 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:48:32.258842 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:32.258788 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" podStartSLOduration=8.870619382 podStartE2EDuration="27.258773602s" podCreationTimestamp="2026-04-23 08:48:05 +0000 UTC" firstStartedPulling="2026-04-23 08:48:06.312053609 +0000 UTC m=+1.841533613" lastFinishedPulling="2026-04-23 08:48:24.700207829 +0000 UTC m=+20.229687833" observedRunningTime="2026-04-23 08:48:32.258070724 +0000 UTC m=+27.787550750" watchObservedRunningTime="2026-04-23 08:48:32.258773602 +0000 UTC m=+27.788253643" Apr 23 08:48:32.360967 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:32.360930 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4mjb6"] Apr 23 08:48:32.361128 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:32.361048 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:48:32.361174 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:32.361142 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4mjb6" podUID="bb54c9e2-2568-441d-a59a-ffa007afefd6" Apr 23 08:48:32.361525 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:32.361499 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-q6kd2"] Apr 23 08:48:32.361616 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:32.361601 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:32.361696 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:32.361676 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q6kd2" podUID="f0ec8bf3-b2b1-4ced-9317-2706c95af066" Apr 23 08:48:32.363013 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:32.362991 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-96m6d"] Apr 23 08:48:32.363099 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:32.363088 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:32.363265 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:32.363244 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96m6d" podUID="f3e78a7f-f7cc-48d6-a29a-7d418c195aef" Apr 23 08:48:33.218317 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:33.218286 2579 generic.go:358] "Generic (PLEG): container finished" podID="8e6d8633-b9de-4e37-96e7-465a09675f90" containerID="7e9a93109690dc541385e773513e56732dc916a2a520f1590a8970049d72279c" exitCode=0 Apr 23 08:48:33.218715 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:33.218384 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jwnqb" event={"ID":"8e6d8633-b9de-4e37-96e7-465a09675f90","Type":"ContainerDied","Data":"7e9a93109690dc541385e773513e56732dc916a2a520f1590a8970049d72279c"} Apr 23 08:48:34.011805 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:34.011768 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:34.011805 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:34.011797 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:48:34.012008 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:34.011816 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:34.012008 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:34.011900 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q6kd2" podUID="f0ec8bf3-b2b1-4ced-9317-2706c95af066" Apr 23 08:48:34.012121 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:34.012017 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96m6d" podUID="f3e78a7f-f7cc-48d6-a29a-7d418c195aef" Apr 23 08:48:34.012121 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:34.012081 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4mjb6" podUID="bb54c9e2-2568-441d-a59a-ffa007afefd6" Apr 23 08:48:36.011852 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:36.011646 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:48:36.012387 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:36.011648 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:36.012387 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:36.011921 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4mjb6" podUID="bb54c9e2-2568-441d-a59a-ffa007afefd6" Apr 23 08:48:36.012387 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:36.012020 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q6kd2" podUID="f0ec8bf3-b2b1-4ced-9317-2706c95af066" Apr 23 08:48:36.012387 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:36.011714 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:36.012387 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:36.012147 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96m6d" podUID="f3e78a7f-f7cc-48d6-a29a-7d418c195aef" Apr 23 08:48:37.760004 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:37.759880 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs\") pod \"network-metrics-daemon-96m6d\" (UID: \"f3e78a7f-f7cc-48d6-a29a-7d418c195aef\") " pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:37.760451 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:37.760079 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:37.760451 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:37.760165 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs podName:f3e78a7f-f7cc-48d6-a29a-7d418c195aef nodeName:}" failed. No retries permitted until 2026-04-23 08:49:09.760144114 +0000 UTC m=+65.289624137 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs") pod "network-metrics-daemon-96m6d" (UID: "f3e78a7f-f7cc-48d6-a29a-7d418c195aef") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:48:37.819105 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:37.819074 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeReady" Apr 23 08:48:37.819301 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:37.819237 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 08:48:37.861051 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:37.860946 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfrd2\" (UniqueName: \"kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2\") pod \"network-check-target-4mjb6\" (UID: \"bb54c9e2-2568-441d-a59a-ffa007afefd6\") " pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:48:37.861218 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:37.861148 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:48:37.861218 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:37.861176 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:48:37.861218 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:37.861190 2579 projected.go:194] Error preparing data for projected volume kube-api-access-qfrd2 for pod openshift-network-diagnostics/network-check-target-4mjb6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:37.861416 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:37.861284 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2 podName:bb54c9e2-2568-441d-a59a-ffa007afefd6 nodeName:}" failed. No retries permitted until 2026-04-23 08:49:09.861261952 +0000 UTC m=+65.390741978 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qfrd2" (UniqueName: "kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2") pod "network-check-target-4mjb6" (UID: "bb54c9e2-2568-441d-a59a-ffa007afefd6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:48:37.884752 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:37.884160 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-sr575"] Apr 23 08:48:37.912290 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:37.912258 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dd2cl"] Apr 23 08:48:37.912487 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:37.912466 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sr575" Apr 23 08:48:37.914687 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:37.914665 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 08:48:37.914967 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:37.914946 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 08:48:37.915191 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:37.915176 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hskmq\"" Apr 23 08:48:37.927310 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:37.927289 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sr575"] Apr 23 08:48:37.927450 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:37.927328 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dd2cl"] Apr 23 08:48:37.927450 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:37.927444 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dd2cl" Apr 23 08:48:37.929569 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:37.929550 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 08:48:37.929685 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:37.929611 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 08:48:37.929685 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:37.929645 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 08:48:37.929685 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:37.929671 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-55hsm\"" Apr 23 08:48:38.011318 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.011231 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:48:38.011486 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.011231 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:38.011486 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.011231 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:48:38.014997 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.014966 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:48:38.015146 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.015121 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 08:48:38.015238 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.015214 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zffjs\"" Apr 23 08:48:38.016534 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.016515 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 08:48:38.016657 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.016563 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-h75n2\"" Apr 23 08:48:38.016657 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.016518 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 08:48:38.063078 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.063051 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert\") pod \"ingress-canary-dd2cl\" (UID: \"75c84a62-de6b-4301-9660-8d6ba1422a31\") " pod="openshift-ingress-canary/ingress-canary-dd2cl" Apr 23 08:48:38.063234 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.063085 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:48:38.063234 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.063109 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/533a84b7-1e94-4312-8393-c6c787af53b6-tmp-dir\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:48:38.063234 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.063152 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2jhv\" (UniqueName: \"kubernetes.io/projected/533a84b7-1e94-4312-8393-c6c787af53b6-kube-api-access-m2jhv\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:48:38.063234 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.063196 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpvpk\" (UniqueName: \"kubernetes.io/projected/75c84a62-de6b-4301-9660-8d6ba1422a31-kube-api-access-tpvpk\") pod \"ingress-canary-dd2cl\" (UID: \"75c84a62-de6b-4301-9660-8d6ba1422a31\") " pod="openshift-ingress-canary/ingress-canary-dd2cl" Apr 23 08:48:38.063465 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.063235 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/533a84b7-1e94-4312-8393-c6c787af53b6-config-volume\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:48:38.164512 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.164459 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert\") pod \"ingress-canary-dd2cl\" (UID: \"75c84a62-de6b-4301-9660-8d6ba1422a31\") " pod="openshift-ingress-canary/ingress-canary-dd2cl" Apr 23 08:48:38.164710 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.164526 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:48:38.164710 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.164562 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/533a84b7-1e94-4312-8393-c6c787af53b6-tmp-dir\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:48:38.164710 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:38.164588 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:38.164710 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.164597 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2jhv\" (UniqueName: \"kubernetes.io/projected/533a84b7-1e94-4312-8393-c6c787af53b6-kube-api-access-m2jhv\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:48:38.164710 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.164624 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpvpk\" (UniqueName: \"kubernetes.io/projected/75c84a62-de6b-4301-9660-8d6ba1422a31-kube-api-access-tpvpk\") pod \"ingress-canary-dd2cl\" (UID: \"75c84a62-de6b-4301-9660-8d6ba1422a31\") " pod="openshift-ingress-canary/ingress-canary-dd2cl" Apr 23 08:48:38.164710 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:38.164663 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert podName:75c84a62-de6b-4301-9660-8d6ba1422a31 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:38.664641807 +0000 UTC m=+34.194121826 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert") pod "ingress-canary-dd2cl" (UID: "75c84a62-de6b-4301-9660-8d6ba1422a31") : secret "canary-serving-cert" not found Apr 23 08:48:38.164710 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.164690 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/533a84b7-1e94-4312-8393-c6c787af53b6-config-volume\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:48:38.165174 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:38.165135 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:38.165257 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:38.165194 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls podName:533a84b7-1e94-4312-8393-c6c787af53b6 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:38.665176528 +0000 UTC m=+34.194656535 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls") pod "dns-default-sr575" (UID: "533a84b7-1e94-4312-8393-c6c787af53b6") : secret "dns-default-metrics-tls" not found Apr 23 08:48:38.165257 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.165237 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/533a84b7-1e94-4312-8393-c6c787af53b6-tmp-dir\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:48:38.165386 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.165324 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/533a84b7-1e94-4312-8393-c6c787af53b6-config-volume\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:48:38.175889 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.175861 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2jhv\" (UniqueName: \"kubernetes.io/projected/533a84b7-1e94-4312-8393-c6c787af53b6-kube-api-access-m2jhv\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:48:38.176130 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.176109 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpvpk\" (UniqueName: \"kubernetes.io/projected/75c84a62-de6b-4301-9660-8d6ba1422a31-kube-api-access-tpvpk\") pod \"ingress-canary-dd2cl\" (UID: \"75c84a62-de6b-4301-9660-8d6ba1422a31\") " pod="openshift-ingress-canary/ingress-canary-dd2cl" Apr 23 08:48:38.668852 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.668822 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert\") pod \"ingress-canary-dd2cl\" (UID: \"75c84a62-de6b-4301-9660-8d6ba1422a31\") " pod="openshift-ingress-canary/ingress-canary-dd2cl" Apr 23 08:48:38.669056 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:38.668864 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:48:38.669056 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:38.668982 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:38.669056 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:38.668983 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:38.669056 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:38.669047 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert podName:75c84a62-de6b-4301-9660-8d6ba1422a31 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:39.669029228 +0000 UTC m=+35.198509236 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert") pod "ingress-canary-dd2cl" (UID: "75c84a62-de6b-4301-9660-8d6ba1422a31") : secret "canary-serving-cert" not found Apr 23 08:48:38.669241 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:38.669066 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls podName:533a84b7-1e94-4312-8393-c6c787af53b6 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:39.669058132 +0000 UTC m=+35.198538135 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls") pod "dns-default-sr575" (UID: "533a84b7-1e94-4312-8393-c6c787af53b6") : secret "dns-default-metrics-tls" not found Apr 23 08:48:39.677760 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:39.677719 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert\") pod \"ingress-canary-dd2cl\" (UID: \"75c84a62-de6b-4301-9660-8d6ba1422a31\") " pod="openshift-ingress-canary/ingress-canary-dd2cl" Apr 23 08:48:39.677760 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:39.677760 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:48:39.678274 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:39.677861 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:39.678274 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:39.677926 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert podName:75c84a62-de6b-4301-9660-8d6ba1422a31 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:41.677908394 +0000 UTC m=+37.207388398 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert") pod "ingress-canary-dd2cl" (UID: "75c84a62-de6b-4301-9660-8d6ba1422a31") : secret "canary-serving-cert" not found Apr 23 08:48:39.678274 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:39.677864 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:39.678274 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:39.678002 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls podName:533a84b7-1e94-4312-8393-c6c787af53b6 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:41.677989872 +0000 UTC m=+37.207469876 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls") pod "dns-default-sr575" (UID: "533a84b7-1e94-4312-8393-c6c787af53b6") : secret "dns-default-metrics-tls" not found Apr 23 08:48:40.233465 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:40.233429 2579 generic.go:358] "Generic (PLEG): container finished" podID="8e6d8633-b9de-4e37-96e7-465a09675f90" containerID="b8432889d12f70d946366f0ccb737c6f6d0693e109b177be21c9c59bfa52b634" exitCode=0 Apr 23 08:48:40.233641 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:40.233477 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jwnqb" event={"ID":"8e6d8633-b9de-4e37-96e7-465a09675f90","Type":"ContainerDied","Data":"b8432889d12f70d946366f0ccb737c6f6d0693e109b177be21c9c59bfa52b634"} Apr 23 08:48:41.087042 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:41.087009 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret\") pod \"global-pull-secret-syncer-q6kd2\" (UID: \"f0ec8bf3-b2b1-4ced-9317-2706c95af066\") " pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:41.096436 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:41.096409 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f0ec8bf3-b2b1-4ced-9317-2706c95af066-original-pull-secret\") pod \"global-pull-secret-syncer-q6kd2\" (UID: \"f0ec8bf3-b2b1-4ced-9317-2706c95af066\") " pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:41.237878 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:41.237849 2579 generic.go:358] "Generic (PLEG): container finished" podID="8e6d8633-b9de-4e37-96e7-465a09675f90" containerID="6693bb5751cc1acc1dda5e0e7b3e2ad41011f209ae2a2c80d940a2ab9d5d30b9" exitCode=0 Apr 23 08:48:41.238018 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:41.237905 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jwnqb" event={"ID":"8e6d8633-b9de-4e37-96e7-465a09675f90","Type":"ContainerDied","Data":"6693bb5751cc1acc1dda5e0e7b3e2ad41011f209ae2a2c80d940a2ab9d5d30b9"} Apr 23 08:48:41.332041 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:41.331883 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q6kd2" Apr 23 08:48:41.489825 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:41.489792 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-q6kd2"] Apr 23 08:48:41.693694 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:41.693661 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert\") pod \"ingress-canary-dd2cl\" (UID: \"75c84a62-de6b-4301-9660-8d6ba1422a31\") " pod="openshift-ingress-canary/ingress-canary-dd2cl" Apr 23 08:48:41.693847 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:41.693698 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:48:41.693847 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:41.693815 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:41.693847 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:41.693815 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:41.693939 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:41.693883 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert podName:75c84a62-de6b-4301-9660-8d6ba1422a31 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:45.693866828 +0000 UTC m=+41.223346831 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert") pod "ingress-canary-dd2cl" (UID: "75c84a62-de6b-4301-9660-8d6ba1422a31") : secret "canary-serving-cert" not found Apr 23 08:48:41.693939 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:41.693897 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls podName:533a84b7-1e94-4312-8393-c6c787af53b6 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:45.693890954 +0000 UTC m=+41.223370958 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls") pod "dns-default-sr575" (UID: "533a84b7-1e94-4312-8393-c6c787af53b6") : secret "dns-default-metrics-tls" not found Apr 23 08:48:42.243746 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:42.243702 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jwnqb" event={"ID":"8e6d8633-b9de-4e37-96e7-465a09675f90","Type":"ContainerStarted","Data":"65892a0f33b9696614c0fd6be41337215cd4d146d3ac3c93af69001c6b7e9297"} Apr 23 08:48:42.245818 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:42.245790 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-q6kd2" event={"ID":"f0ec8bf3-b2b1-4ced-9317-2706c95af066","Type":"ContainerStarted","Data":"c5899a3688c6736c913ba59a16553a68777970cd0fe480cd532c6193715aea27"} Apr 23 08:48:42.266600 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:42.266541 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jwnqb" podStartSLOduration=4.266975278 podStartE2EDuration="37.266523318s" podCreationTimestamp="2026-04-23 08:48:05 +0000 UTC" firstStartedPulling="2026-04-23 08:48:06.232672939 +0000 UTC m=+1.762152947" lastFinishedPulling="2026-04-23 08:48:39.232220982 +0000 UTC m=+34.761700987" observedRunningTime="2026-04-23 08:48:42.264645913 +0000 UTC m=+37.794125931" watchObservedRunningTime="2026-04-23 08:48:42.266523318 +0000 UTC m=+37.796003347" Apr 23 08:48:45.726580 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:45.726481 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert\") pod \"ingress-canary-dd2cl\" (UID: \"75c84a62-de6b-4301-9660-8d6ba1422a31\") " pod="openshift-ingress-canary/ingress-canary-dd2cl" Apr 23 08:48:45.726580 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:45.726528 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:48:45.726985 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:45.726630 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:45.726985 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:45.726650 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:45.726985 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:45.726699 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert podName:75c84a62-de6b-4301-9660-8d6ba1422a31 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:53.726682223 +0000 UTC m=+49.256162243 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert") pod "ingress-canary-dd2cl" (UID: "75c84a62-de6b-4301-9660-8d6ba1422a31") : secret "canary-serving-cert" not found Apr 23 08:48:45.726985 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:45.726714 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls podName:533a84b7-1e94-4312-8393-c6c787af53b6 nodeName:}" failed. No retries permitted until 2026-04-23 08:48:53.7267083 +0000 UTC m=+49.256188304 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls") pod "dns-default-sr575" (UID: "533a84b7-1e94-4312-8393-c6c787af53b6") : secret "dns-default-metrics-tls" not found Apr 23 08:48:46.253875 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:46.253834 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-q6kd2" event={"ID":"f0ec8bf3-b2b1-4ced-9317-2706c95af066","Type":"ContainerStarted","Data":"f2d583d64ca15c673f92cd9b9ed81a0dfdc19d3ed14559e91eef0cf55adaffc3"} Apr 23 08:48:46.268143 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:46.268104 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-q6kd2" podStartSLOduration=33.307071898 podStartE2EDuration="37.268092052s" podCreationTimestamp="2026-04-23 08:48:09 +0000 UTC" firstStartedPulling="2026-04-23 08:48:41.49500377 +0000 UTC m=+37.024483787" lastFinishedPulling="2026-04-23 08:48:45.456023937 +0000 UTC m=+40.985503941" observedRunningTime="2026-04-23 08:48:46.26745408 +0000 UTC m=+41.796934104" watchObservedRunningTime="2026-04-23 08:48:46.268092052 +0000 UTC m=+41.797572078" Apr 23 08:48:53.782259 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:53.782225 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert\") pod \"ingress-canary-dd2cl\" (UID: \"75c84a62-de6b-4301-9660-8d6ba1422a31\") " pod="openshift-ingress-canary/ingress-canary-dd2cl" Apr 23 08:48:53.782259 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:48:53.782263 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:48:53.782693 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:53.782397 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:48:53.782693 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:53.782467 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert podName:75c84a62-de6b-4301-9660-8d6ba1422a31 nodeName:}" failed. No retries permitted until 2026-04-23 08:49:09.782451965 +0000 UTC m=+65.311931969 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert") pod "ingress-canary-dd2cl" (UID: "75c84a62-de6b-4301-9660-8d6ba1422a31") : secret "canary-serving-cert" not found Apr 23 08:48:53.782693 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:53.782397 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:48:53.782693 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:48:53.782529 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls podName:533a84b7-1e94-4312-8393-c6c787af53b6 nodeName:}" failed. No retries permitted until 2026-04-23 08:49:09.782517913 +0000 UTC m=+65.311997924 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls") pod "dns-default-sr575" (UID: "533a84b7-1e94-4312-8393-c6c787af53b6") : secret "dns-default-metrics-tls" not found Apr 23 08:49:04.230873 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:04.230843 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9bsnx" Apr 23 08:49:09.795403 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:09.795328 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert\") pod \"ingress-canary-dd2cl\" (UID: \"75c84a62-de6b-4301-9660-8d6ba1422a31\") " pod="openshift-ingress-canary/ingress-canary-dd2cl" Apr 23 08:49:09.795875 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:09.795413 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:49:09.795875 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:09.795470 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs\") pod \"network-metrics-daemon-96m6d\" (UID: \"f3e78a7f-f7cc-48d6-a29a-7d418c195aef\") " pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:49:09.795875 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:49:09.795477 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:49:09.795875 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:49:09.795537 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert podName:75c84a62-de6b-4301-9660-8d6ba1422a31 nodeName:}" failed. No retries permitted until 2026-04-23 08:49:41.795522516 +0000 UTC m=+97.325002520 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert") pod "ingress-canary-dd2cl" (UID: "75c84a62-de6b-4301-9660-8d6ba1422a31") : secret "canary-serving-cert" not found Apr 23 08:49:09.795875 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:49:09.795542 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:49:09.795875 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:49:09.795605 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls podName:533a84b7-1e94-4312-8393-c6c787af53b6 nodeName:}" failed. No retries permitted until 2026-04-23 08:49:41.795588809 +0000 UTC m=+97.325068821 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls") pod "dns-default-sr575" (UID: "533a84b7-1e94-4312-8393-c6c787af53b6") : secret "dns-default-metrics-tls" not found Apr 23 08:49:09.797912 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:09.797893 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:49:09.806219 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:49:09.806204 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 08:49:09.806316 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:49:09.806269 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs podName:f3e78a7f-f7cc-48d6-a29a-7d418c195aef nodeName:}" failed. No retries permitted until 2026-04-23 08:50:13.806251806 +0000 UTC m=+129.335731823 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs") pod "network-metrics-daemon-96m6d" (UID: "f3e78a7f-f7cc-48d6-a29a-7d418c195aef") : secret "metrics-daemon-secret" not found Apr 23 08:49:09.896800 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:09.896761 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfrd2\" (UniqueName: \"kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2\") pod \"network-check-target-4mjb6\" (UID: \"bb54c9e2-2568-441d-a59a-ffa007afefd6\") " pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:49:09.899368 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:09.899327 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 08:49:09.910005 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:09.909980 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 08:49:09.935741 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:09.935697 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfrd2\" (UniqueName: \"kubernetes.io/projected/bb54c9e2-2568-441d-a59a-ffa007afefd6-kube-api-access-qfrd2\") pod \"network-check-target-4mjb6\" (UID: \"bb54c9e2-2568-441d-a59a-ffa007afefd6\") " pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:49:10.150382 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:10.150360 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-h75n2\"" Apr 23 08:49:10.159044 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:10.159027 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:49:10.282833 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:10.282802 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4mjb6"] Apr 23 08:49:10.285896 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:49:10.285855 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb54c9e2_2568_441d_a59a_ffa007afefd6.slice/crio-8e8bdd985cd91a4379b678f59321faa8761a0eaebd9fb7e46e5b6e4fc058974a WatchSource:0}: Error finding container 8e8bdd985cd91a4379b678f59321faa8761a0eaebd9fb7e46e5b6e4fc058974a: Status 404 returned error can't find the container with id 8e8bdd985cd91a4379b678f59321faa8761a0eaebd9fb7e46e5b6e4fc058974a Apr 23 08:49:10.302311 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:10.302282 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4mjb6" event={"ID":"bb54c9e2-2568-441d-a59a-ffa007afefd6","Type":"ContainerStarted","Data":"8e8bdd985cd91a4379b678f59321faa8761a0eaebd9fb7e46e5b6e4fc058974a"} Apr 23 08:49:13.309398 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:13.309293 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4mjb6" event={"ID":"bb54c9e2-2568-441d-a59a-ffa007afefd6","Type":"ContainerStarted","Data":"511a85d311d81e4a9d8c86a924fc0b22b5300abc6cc0a6428eb9f89f7b47d60a"} Apr 23 08:49:13.309760 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:13.309441 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:49:13.327682 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:13.327633 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-4mjb6" podStartSLOduration=65.627879408 podStartE2EDuration="1m8.327621145s" podCreationTimestamp="2026-04-23 08:48:05 +0000 UTC" firstStartedPulling="2026-04-23 08:49:10.287672196 +0000 UTC m=+65.817152203" lastFinishedPulling="2026-04-23 08:49:12.987413932 +0000 UTC m=+68.516893940" observedRunningTime="2026-04-23 08:49:13.327571518 +0000 UTC m=+68.857051565" watchObservedRunningTime="2026-04-23 08:49:13.327621145 +0000 UTC m=+68.857101163" Apr 23 08:49:41.825052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:41.824890 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:49:41.825052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:41.824978 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert\") pod \"ingress-canary-dd2cl\" (UID: \"75c84a62-de6b-4301-9660-8d6ba1422a31\") " pod="openshift-ingress-canary/ingress-canary-dd2cl" Apr 23 08:49:41.825052 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:49:41.825055 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:49:41.825052 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:49:41.825051 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: 
secret "dns-default-metrics-tls" not found Apr 23 08:49:41.825629 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:49:41.825107 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert podName:75c84a62-de6b-4301-9660-8d6ba1422a31 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:45.825093431 +0000 UTC m=+161.354573435 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert") pod "ingress-canary-dd2cl" (UID: "75c84a62-de6b-4301-9660-8d6ba1422a31") : secret "canary-serving-cert" not found Apr 23 08:49:41.825629 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:49:41.825132 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls podName:533a84b7-1e94-4312-8393-c6c787af53b6 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:45.825111434 +0000 UTC m=+161.354591439 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls") pod "dns-default-sr575" (UID: "533a84b7-1e94-4312-8393-c6c787af53b6") : secret "dns-default-metrics-tls" not found Apr 23 08:49:44.313969 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:49:44.313931 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-4mjb6" Apr 23 08:50:03.053598 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.053561 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7ftql"] Apr 23 08:50:03.057069 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.057035 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h"] Apr 23 08:50:03.057462 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.057442 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" Apr 23 08:50:03.059705 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.059681 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 23 08:50:03.060242 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.060221 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-84bcc947b6-dxbqx"] Apr 23 08:50:03.060389 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.060372 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h" Apr 23 08:50:03.060446 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.060393 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 23 08:50:03.060446 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.060427 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 23 08:50:03.063094 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.063051 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:03.064904 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.064830 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:50:03.066796 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.066776 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-nhzhz\"" Apr 23 08:50:03.066940 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.066924 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 23 08:50:03.067969 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.067896 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 23 08:50:03.067969 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.067947 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 23 08:50:03.067969 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.067958 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-g2b9m\"" Apr 23 08:50:03.067969 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.067963 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 23 08:50:03.068196 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.067986 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 23 08:50:03.068196 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.068168 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 23 08:50:03.068318 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.068210 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:50:03.068318 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.068234 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-r69jp\"" Apr 23 08:50:03.068318 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.068229 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 23 08:50:03.068672 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.068654 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 23 08:50:03.072423 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.072408 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 23 08:50:03.076968 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.076946 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7ftql"] Apr 23 08:50:03.077737 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.077715 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h"] Apr 
23 08:50:03.082429 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.082408 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-84bcc947b6-dxbqx"] Apr 23 08:50:03.152431 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.152404 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs"] Apr 23 08:50:03.155067 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.155053 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" Apr 23 08:50:03.157838 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.157816 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 08:50:03.157838 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.157839 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 23 08:50:03.158037 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.157857 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 23 08:50:03.158037 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.157862 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 08:50:03.158037 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.157906 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-68f6g\"" Apr 23 08:50:03.163727 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.163704 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-stats-auth\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:03.163831 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.163743 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-metrics-certs\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:03.163831 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.163785 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-service-ca-bundle\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:03.163831 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.163811 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht7m9\" (UniqueName: \"kubernetes.io/projected/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-kube-api-access-ht7m9\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:03.163988 ip-10-0-131-47 kubenswrapper[2579]: I0423 
08:50:03.163862 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e2935cf-4651-42ac-bd9e-54accc810a7a-config\") pod \"console-operator-9d4b6777b-7ftql\" (UID: \"8e2935cf-4651-42ac-bd9e-54accc810a7a\") " pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" Apr 23 08:50:03.163988 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.163897 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e2935cf-4651-42ac-bd9e-54accc810a7a-trusted-ca\") pod \"console-operator-9d4b6777b-7ftql\" (UID: \"8e2935cf-4651-42ac-bd9e-54accc810a7a\") " pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" Apr 23 08:50:03.164076 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.163980 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-default-certificate\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:03.164076 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.164014 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-st52h\" (UID: \"b0e5f2a3-2fce-4078-99b7-c4be47e87be7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h" Apr 23 08:50:03.164076 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.164032 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2935cf-4651-42ac-bd9e-54accc810a7a-serving-cert\") pod \"console-operator-9d4b6777b-7ftql\" (UID: \"8e2935cf-4651-42ac-bd9e-54accc810a7a\") " pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" Apr 23 08:50:03.164076 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.164051 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd6pl\" (UniqueName: \"kubernetes.io/projected/8e2935cf-4651-42ac-bd9e-54accc810a7a-kube-api-access-vd6pl\") pod \"console-operator-9d4b6777b-7ftql\" (UID: \"8e2935cf-4651-42ac-bd9e-54accc810a7a\") " pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" Apr 23 08:50:03.164264 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.164083 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ptvx\" (UniqueName: \"kubernetes.io/projected/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-kube-api-access-7ptvx\") pod \"cluster-samples-operator-6dc5bdb6b4-st52h\" (UID: \"b0e5f2a3-2fce-4078-99b7-c4be47e87be7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h" Apr 23 08:50:03.164889 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.164868 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs"] Apr 23 08:50:03.265173 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.265141 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-default-certificate\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:03.265283 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.265190 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-st52h\" (UID: \"b0e5f2a3-2fce-4078-99b7-c4be47e87be7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h" Apr 23 08:50:03.265283 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.265219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2935cf-4651-42ac-bd9e-54accc810a7a-serving-cert\") pod \"console-operator-9d4b6777b-7ftql\" (UID: \"8e2935cf-4651-42ac-bd9e-54accc810a7a\") " pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" Apr 23 08:50:03.265283 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.265246 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vd6pl\" (UniqueName: \"kubernetes.io/projected/8e2935cf-4651-42ac-bd9e-54accc810a7a-kube-api-access-vd6pl\") pod \"console-operator-9d4b6777b-7ftql\" (UID: \"8e2935cf-4651-42ac-bd9e-54accc810a7a\") " pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" Apr 23 08:50:03.265472 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.265279 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qh5vs\" (UID: \"450d7653-7e13-4231-acf1-3736103fe1fd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" Apr 23 08:50:03.265472 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.265314 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ptvx\" (UniqueName: \"kubernetes.io/projected/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-kube-api-access-7ptvx\") pod \"cluster-samples-operator-6dc5bdb6b4-st52h\" (UID: \"b0e5f2a3-2fce-4078-99b7-c4be47e87be7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h" Apr 23 08:50:03.265472 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.265368 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/450d7653-7e13-4231-acf1-3736103fe1fd-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-qh5vs\" (UID: \"450d7653-7e13-4231-acf1-3736103fe1fd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" Apr 23 08:50:03.265472 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:03.265380 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 08:50:03.265472 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.265434 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-stats-auth\") pod \"router-default-84bcc947b6-dxbqx\" (UID: 
\"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:03.265472 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.265466 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-metrics-certs\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:03.265777 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.265505 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-service-ca-bundle\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:03.265777 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:03.265546 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-samples-operator-tls podName:b0e5f2a3-2fce-4078-99b7-c4be47e87be7 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:03.765523801 +0000 UTC m=+119.295003805 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-st52h" (UID: "b0e5f2a3-2fce-4078-99b7-c4be47e87be7") : secret "samples-operator-tls" not found Apr 23 08:50:03.265777 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.265572 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ht7m9\" (UniqueName: \"kubernetes.io/projected/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-kube-api-access-ht7m9\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:03.265777 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.265606 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e2935cf-4651-42ac-bd9e-54accc810a7a-config\") pod \"console-operator-9d4b6777b-7ftql\" (UID: \"8e2935cf-4651-42ac-bd9e-54accc810a7a\") " pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" Apr 23 08:50:03.265777 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:03.265633 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 08:50:03.265777 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:03.265635 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-service-ca-bundle podName:dcc4b0c4-24d9-42a8-826a-aeb82cce2a08 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:03.765619706 +0000 UTC m=+119.295099734 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-service-ca-bundle") pod "router-default-84bcc947b6-dxbqx" (UID: "dcc4b0c4-24d9-42a8-826a-aeb82cce2a08") : configmap references non-existent config key: service-ca.crt Apr 23 08:50:03.265777 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.265725 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e2935cf-4651-42ac-bd9e-54accc810a7a-trusted-ca\") pod \"console-operator-9d4b6777b-7ftql\" (UID: \"8e2935cf-4651-42ac-bd9e-54accc810a7a\") " pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" Apr 23 08:50:03.266088 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.265777 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwg2v\" (UniqueName: \"kubernetes.io/projected/450d7653-7e13-4231-acf1-3736103fe1fd-kube-api-access-xwg2v\") pod \"cluster-monitoring-operator-75587bd455-qh5vs\" (UID: \"450d7653-7e13-4231-acf1-3736103fe1fd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" Apr 23 08:50:03.266088 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:03.265847 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-metrics-certs podName:dcc4b0c4-24d9-42a8-826a-aeb82cce2a08 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:03.76582977 +0000 UTC m=+119.295309781 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-metrics-certs") pod "router-default-84bcc947b6-dxbqx" (UID: "dcc4b0c4-24d9-42a8-826a-aeb82cce2a08") : secret "router-metrics-certs-default" not found Apr 23 08:50:03.266522 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.266502 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e2935cf-4651-42ac-bd9e-54accc810a7a-config\") pod \"console-operator-9d4b6777b-7ftql\" (UID: \"8e2935cf-4651-42ac-bd9e-54accc810a7a\") " pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" Apr 23 08:50:03.266522 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.266514 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e2935cf-4651-42ac-bd9e-54accc810a7a-trusted-ca\") pod \"console-operator-9d4b6777b-7ftql\" (UID: \"8e2935cf-4651-42ac-bd9e-54accc810a7a\") " pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" Apr 23 08:50:03.267650 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.267633 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2935cf-4651-42ac-bd9e-54accc810a7a-serving-cert\") pod \"console-operator-9d4b6777b-7ftql\" (UID: \"8e2935cf-4651-42ac-bd9e-54accc810a7a\") " pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" Apr 23 08:50:03.268307 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.268292 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-stats-auth\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:03.268463 
ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.268444 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-default-certificate\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:03.273248 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.273229 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ptvx\" (UniqueName: \"kubernetes.io/projected/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-kube-api-access-7ptvx\") pod \"cluster-samples-operator-6dc5bdb6b4-st52h\" (UID: \"b0e5f2a3-2fce-4078-99b7-c4be47e87be7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h" Apr 23 08:50:03.273588 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.273570 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd6pl\" (UniqueName: \"kubernetes.io/projected/8e2935cf-4651-42ac-bd9e-54accc810a7a-kube-api-access-vd6pl\") pod \"console-operator-9d4b6777b-7ftql\" (UID: \"8e2935cf-4651-42ac-bd9e-54accc810a7a\") " pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" Apr 23 08:50:03.273654 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.273619 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht7m9\" (UniqueName: \"kubernetes.io/projected/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-kube-api-access-ht7m9\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:03.366914 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.366821 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/450d7653-7e13-4231-acf1-3736103fe1fd-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-qh5vs\" (UID: \"450d7653-7e13-4231-acf1-3736103fe1fd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" Apr 23 08:50:03.367032 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.366977 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwg2v\" (UniqueName: \"kubernetes.io/projected/450d7653-7e13-4231-acf1-3736103fe1fd-kube-api-access-xwg2v\") pod \"cluster-monitoring-operator-75587bd455-qh5vs\" (UID: \"450d7653-7e13-4231-acf1-3736103fe1fd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" Apr 23 08:50:03.367077 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.367042 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qh5vs\" (UID: \"450d7653-7e13-4231-acf1-3736103fe1fd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" Apr 23 08:50:03.367161 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:03.367144 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:50:03.367239 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:03.367221 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls podName:450d7653-7e13-4231-acf1-3736103fe1fd nodeName:}" failed. No retries permitted until 2026-04-23 08:50:03.867202546 +0000 UTC m=+119.396682569 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qh5vs" (UID: "450d7653-7e13-4231-acf1-3736103fe1fd") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:50:03.367598 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.367581 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/450d7653-7e13-4231-acf1-3736103fe1fd-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-qh5vs\" (UID: \"450d7653-7e13-4231-acf1-3736103fe1fd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" Apr 23 08:50:03.367691 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.367678 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" Apr 23 08:50:03.376832 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.376810 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwg2v\" (UniqueName: \"kubernetes.io/projected/450d7653-7e13-4231-acf1-3736103fe1fd-kube-api-access-xwg2v\") pod \"cluster-monitoring-operator-75587bd455-qh5vs\" (UID: \"450d7653-7e13-4231-acf1-3736103fe1fd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" Apr 23 08:50:03.477661 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.477620 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7ftql"] Apr 23 08:50:03.480200 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:50:03.480172 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e2935cf_4651_42ac_bd9e_54accc810a7a.slice/crio-c5ef9bbf1cd3064394f1fcb5e0aaea39b97c158eeb90241d0258e492b7574bf9 WatchSource:0}: Error finding container c5ef9bbf1cd3064394f1fcb5e0aaea39b97c158eeb90241d0258e492b7574bf9: Status 404 returned error can't find the container with id c5ef9bbf1cd3064394f1fcb5e0aaea39b97c158eeb90241d0258e492b7574bf9 Apr 23 08:50:03.771089 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.771047 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-service-ca-bundle\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:03.771264 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.771112 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-st52h\" (UID: \"b0e5f2a3-2fce-4078-99b7-c4be47e87be7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h" Apr 23 08:50:03.771264 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.771171 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-metrics-certs\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:03.771264 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:03.771214 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 08:50:03.771264 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:03.771256 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 08:50:03.771428 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:03.771263 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-service-ca-bundle podName:dcc4b0c4-24d9-42a8-826a-aeb82cce2a08 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:04.771241994 +0000 UTC m=+120.300722011 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-service-ca-bundle") pod "router-default-84bcc947b6-dxbqx" (UID: "dcc4b0c4-24d9-42a8-826a-aeb82cce2a08") : configmap references non-existent config key: service-ca.crt Apr 23 08:50:03.771428 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:03.771283 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-samples-operator-tls podName:b0e5f2a3-2fce-4078-99b7-c4be47e87be7 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:04.771276 +0000 UTC m=+120.300756004 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-st52h" (UID: "b0e5f2a3-2fce-4078-99b7-c4be47e87be7") : secret "samples-operator-tls" not found Apr 23 08:50:03.771428 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:03.771299 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-metrics-certs podName:dcc4b0c4-24d9-42a8-826a-aeb82cce2a08 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:04.771288246 +0000 UTC m=+120.300768250 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-metrics-certs") pod "router-default-84bcc947b6-dxbqx" (UID: "dcc4b0c4-24d9-42a8-826a-aeb82cce2a08") : secret "router-metrics-certs-default" not found Apr 23 08:50:03.872110 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:03.872073 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qh5vs\" (UID: \"450d7653-7e13-4231-acf1-3736103fe1fd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" Apr 23 08:50:03.872240 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:03.872178 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:50:03.872240 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:03.872229 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls podName:450d7653-7e13-4231-acf1-3736103fe1fd nodeName:}" failed. No retries permitted until 2026-04-23 08:50:04.87221431 +0000 UTC m=+120.401694331 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qh5vs" (UID: "450d7653-7e13-4231-acf1-3736103fe1fd") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:50:04.402268 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:04.402229 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" event={"ID":"8e2935cf-4651-42ac-bd9e-54accc810a7a","Type":"ContainerStarted","Data":"c5ef9bbf1cd3064394f1fcb5e0aaea39b97c158eeb90241d0258e492b7574bf9"} Apr 23 08:50:04.782719 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:04.782632 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-metrics-certs\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:04.782719 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:04.782692 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-service-ca-bundle\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:04.782948 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:04.782750 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-st52h\" (UID: \"b0e5f2a3-2fce-4078-99b7-c4be47e87be7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h" Apr 23 08:50:04.782948 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:04.782781 2579 secret.go:189] Couldn't get secret 
openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 08:50:04.782948 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:04.782843 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-metrics-certs podName:dcc4b0c4-24d9-42a8-826a-aeb82cce2a08 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:06.782827081 +0000 UTC m=+122.312307084 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-metrics-certs") pod "router-default-84bcc947b6-dxbqx" (UID: "dcc4b0c4-24d9-42a8-826a-aeb82cce2a08") : secret "router-metrics-certs-default" not found Apr 23 08:50:04.782948 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:04.782860 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-service-ca-bundle podName:dcc4b0c4-24d9-42a8-826a-aeb82cce2a08 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:06.782851434 +0000 UTC m=+122.312331447 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-service-ca-bundle") pod "router-default-84bcc947b6-dxbqx" (UID: "dcc4b0c4-24d9-42a8-826a-aeb82cce2a08") : configmap references non-existent config key: service-ca.crt Apr 23 08:50:04.782948 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:04.782882 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 08:50:04.782948 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:04.782951 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-samples-operator-tls podName:b0e5f2a3-2fce-4078-99b7-c4be47e87be7 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:06.782935317 +0000 UTC m=+122.312415338 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-st52h" (UID: "b0e5f2a3-2fce-4078-99b7-c4be47e87be7") : secret "samples-operator-tls" not found Apr 23 08:50:04.883581 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:04.883541 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qh5vs\" (UID: \"450d7653-7e13-4231-acf1-3736103fe1fd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" Apr 23 08:50:04.883733 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:04.883689 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:50:04.883784 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:04.883766 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls podName:450d7653-7e13-4231-acf1-3736103fe1fd nodeName:}" failed. No retries permitted until 2026-04-23 08:50:06.883744759 +0000 UTC m=+122.413224784 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qh5vs" (UID: "450d7653-7e13-4231-acf1-3736103fe1fd") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:50:06.407356 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:06.407307 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/0.log" Apr 23 08:50:06.407713 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:06.407375 2579 generic.go:358] "Generic (PLEG): container finished" podID="8e2935cf-4651-42ac-bd9e-54accc810a7a" containerID="a46acc9e3cf371f1d84cd54a15ed692416c3fe8b8ba5fbd7822d98d865aa3da3" exitCode=255 Apr 23 08:50:06.407713 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:06.407430 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" event={"ID":"8e2935cf-4651-42ac-bd9e-54accc810a7a","Type":"ContainerDied","Data":"a46acc9e3cf371f1d84cd54a15ed692416c3fe8b8ba5fbd7822d98d865aa3da3"} Apr 23 08:50:06.407713 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:06.407633 2579 scope.go:117] "RemoveContainer" containerID="a46acc9e3cf371f1d84cd54a15ed692416c3fe8b8ba5fbd7822d98d865aa3da3" Apr 23 08:50:06.799920 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:06.799821 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-service-ca-bundle\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:06.799920 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:06.799893 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-st52h\" (UID: \"b0e5f2a3-2fce-4078-99b7-c4be47e87be7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h" Apr 23 08:50:06.800123 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:06.799941 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-metrics-certs\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:06.800123 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:06.799981 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-service-ca-bundle podName:dcc4b0c4-24d9-42a8-826a-aeb82cce2a08 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:10.799964419 +0000 UTC m=+126.329444430 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-service-ca-bundle") pod "router-default-84bcc947b6-dxbqx" (UID: "dcc4b0c4-24d9-42a8-826a-aeb82cce2a08") : configmap references non-existent config key: service-ca.crt Apr 23 08:50:06.800123 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:06.800025 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 08:50:06.800123 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:06.800036 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 08:50:06.800123 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:06.800070 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-metrics-certs podName:dcc4b0c4-24d9-42a8-826a-aeb82cce2a08 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:10.80005923 +0000 UTC m=+126.329539234 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-metrics-certs") pod "router-default-84bcc947b6-dxbqx" (UID: "dcc4b0c4-24d9-42a8-826a-aeb82cce2a08") : secret "router-metrics-certs-default" not found Apr 23 08:50:06.800123 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:06.800087 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-samples-operator-tls podName:b0e5f2a3-2fce-4078-99b7-c4be47e87be7 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:10.800075036 +0000 UTC m=+126.329555039 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-st52h" (UID: "b0e5f2a3-2fce-4078-99b7-c4be47e87be7") : secret "samples-operator-tls" not found Apr 23 08:50:06.900714 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:06.900670 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qh5vs\" (UID: \"450d7653-7e13-4231-acf1-3736103fe1fd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" Apr 23 08:50:06.900878 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:06.900814 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:50:06.900878 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:06.900875 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls podName:450d7653-7e13-4231-acf1-3736103fe1fd nodeName:}" failed. No retries permitted until 2026-04-23 08:50:10.900860011 +0000 UTC m=+126.430340015 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qh5vs" (UID: "450d7653-7e13-4231-acf1-3736103fe1fd") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:50:07.411054 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:07.411028 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/1.log" Apr 23 08:50:07.411481 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:07.411327 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/0.log" Apr 23 08:50:07.411481 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:07.411373 2579 generic.go:358] "Generic (PLEG): container finished" podID="8e2935cf-4651-42ac-bd9e-54accc810a7a" containerID="4dde2d5ed4235dd873daf4ab176924918ce69a9ae3b044719d852eaf2bcd8741" exitCode=255 Apr 23 08:50:07.411481 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:07.411427 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" event={"ID":"8e2935cf-4651-42ac-bd9e-54accc810a7a","Type":"ContainerDied","Data":"4dde2d5ed4235dd873daf4ab176924918ce69a9ae3b044719d852eaf2bcd8741"} Apr 23 08:50:07.411481 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:07.411452 2579 scope.go:117] "RemoveContainer" containerID="a46acc9e3cf371f1d84cd54a15ed692416c3fe8b8ba5fbd7822d98d865aa3da3" Apr 23 08:50:07.411752 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:07.411734 2579 scope.go:117] "RemoveContainer" containerID="4dde2d5ed4235dd873daf4ab176924918ce69a9ae3b044719d852eaf2bcd8741" Apr 23 08:50:07.411945 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:07.411926 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7ftql_openshift-console-operator(8e2935cf-4651-42ac-bd9e-54accc810a7a)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" podUID="8e2935cf-4651-42ac-bd9e-54accc810a7a" Apr 23 08:50:07.580779 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:07.580747 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-cmdv2"] Apr 23 08:50:07.583826 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:07.583810 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cmdv2" Apr 23 08:50:07.586228 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:07.586209 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 23 08:50:07.586373 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:07.586214 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 23 08:50:07.586373 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:07.586329 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-9m644\"" Apr 23 08:50:07.593967 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:07.593949 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-cmdv2"] Apr 23 08:50:07.707661 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:07.707578 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq9dq\" (UniqueName: \"kubernetes.io/projected/a9685c08-6c7b-40e4-971e-fd8cbcf83069-kube-api-access-bq9dq\") pod \"migrator-74bb7799d9-cmdv2\" (UID: \"a9685c08-6c7b-40e4-971e-fd8cbcf83069\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cmdv2" Apr 23 08:50:07.808028 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:07.807987 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bq9dq\" (UniqueName: \"kubernetes.io/projected/a9685c08-6c7b-40e4-971e-fd8cbcf83069-kube-api-access-bq9dq\") pod \"migrator-74bb7799d9-cmdv2\" (UID: \"a9685c08-6c7b-40e4-971e-fd8cbcf83069\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cmdv2" Apr 23 08:50:07.817008 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:07.816976 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq9dq\" (UniqueName: \"kubernetes.io/projected/a9685c08-6c7b-40e4-971e-fd8cbcf83069-kube-api-access-bq9dq\") pod \"migrator-74bb7799d9-cmdv2\" (UID: \"a9685c08-6c7b-40e4-971e-fd8cbcf83069\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cmdv2" Apr 23 08:50:07.892352 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:07.892320 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cmdv2" Apr 23 08:50:08.009919 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:08.009857 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-cmdv2"] Apr 23 08:50:08.012152 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:50:08.012127 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9685c08_6c7b_40e4_971e_fd8cbcf83069.slice/crio-f045163311a22b7c83d6421193eeff0b39d3187564239d47a614483de918f594 WatchSource:0}: Error finding container f045163311a22b7c83d6421193eeff0b39d3187564239d47a614483de918f594: Status 404 returned error can't find the container with id f045163311a22b7c83d6421193eeff0b39d3187564239d47a614483de918f594 Apr 23 08:50:08.414542 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:08.414521 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/1.log" Apr 23 08:50:08.414948 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:08.414881 2579 scope.go:117] "RemoveContainer" containerID="4dde2d5ed4235dd873daf4ab176924918ce69a9ae3b044719d852eaf2bcd8741" Apr 23 08:50:08.415083 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:08.415064 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7ftql_openshift-console-operator(8e2935cf-4651-42ac-bd9e-54accc810a7a)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" podUID="8e2935cf-4651-42ac-bd9e-54accc810a7a" Apr 23 08:50:08.415575 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:08.415555 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cmdv2" event={"ID":"a9685c08-6c7b-40e4-971e-fd8cbcf83069","Type":"ContainerStarted","Data":"f045163311a22b7c83d6421193eeff0b39d3187564239d47a614483de918f594"} Apr 23 08:50:09.421241 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:09.421210 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cmdv2" event={"ID":"a9685c08-6c7b-40e4-971e-fd8cbcf83069","Type":"ContainerStarted","Data":"0431506c8b9ad2e092e50fa3c403853278a502944d49e6c22ca6655ad48cd2e9"} Apr 23 08:50:09.421241 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:09.421245 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cmdv2" event={"ID":"a9685c08-6c7b-40e4-971e-fd8cbcf83069","Type":"ContainerStarted","Data":"c4c6401dcaa6cf3476adbc8b64609fb4f7448f4e71d23a589068a08927a40936"} Apr 23 08:50:09.436702 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:09.436656 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cmdv2" podStartSLOduration=1.2595943 podStartE2EDuration="2.436643313s" podCreationTimestamp="2026-04-23 08:50:07 +0000 UTC" firstStartedPulling="2026-04-23 08:50:08.01437875 +0000 UTC m=+123.543858757" lastFinishedPulling="2026-04-23 08:50:09.191427764 +0000 UTC m=+124.720907770" observedRunningTime="2026-04-23 08:50:09.435615065 +0000 UTC m=+124.965095103" watchObservedRunningTime="2026-04-23 08:50:09.436643313 +0000 UTC 
m=+124.966123336" Apr 23 08:50:10.704078 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:10.704050 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-c68x8_6d76e4ef-b56a-40fd-9c37-6eb55602fea4/dns-node-resolver/0.log" Apr 23 08:50:10.831419 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:10.831386 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-metrics-certs\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:10.831581 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:10.831428 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-service-ca-bundle\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:10.831581 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:10.831468 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-st52h\" (UID: \"b0e5f2a3-2fce-4078-99b7-c4be47e87be7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h" Apr 23 08:50:10.831581 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:10.831525 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 08:50:10.831581 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:10.831538 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 08:50:10.831581 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:10.831577 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-metrics-certs podName:dcc4b0c4-24d9-42a8-826a-aeb82cce2a08 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:18.831563313 +0000 UTC m=+134.361043316 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-metrics-certs") pod "router-default-84bcc947b6-dxbqx" (UID: "dcc4b0c4-24d9-42a8-826a-aeb82cce2a08") : secret "router-metrics-certs-default" not found Apr 23 08:50:10.831767 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:10.831652 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-samples-operator-tls podName:b0e5f2a3-2fce-4078-99b7-c4be47e87be7 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:18.83163271 +0000 UTC m=+134.361112714 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-st52h" (UID: "b0e5f2a3-2fce-4078-99b7-c4be47e87be7") : secret "samples-operator-tls" not found Apr 23 08:50:10.831767 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:10.831683 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-service-ca-bundle podName:dcc4b0c4-24d9-42a8-826a-aeb82cce2a08 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:18.831673089 +0000 UTC m=+134.361153097 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-service-ca-bundle") pod "router-default-84bcc947b6-dxbqx" (UID: "dcc4b0c4-24d9-42a8-826a-aeb82cce2a08") : configmap references non-existent config key: service-ca.crt Apr 23 08:50:10.932151 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:10.932127 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qh5vs\" (UID: \"450d7653-7e13-4231-acf1-3736103fe1fd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" Apr 23 08:50:10.932281 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:10.932265 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:50:10.932329 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:10.932320 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls podName:450d7653-7e13-4231-acf1-3736103fe1fd nodeName:}" failed. No retries permitted until 2026-04-23 08:50:18.932305885 +0000 UTC m=+134.461785890 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qh5vs" (UID: "450d7653-7e13-4231-acf1-3736103fe1fd") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:50:11.904385 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:11.904331 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-z6xdx_a0aeb874-3a11-4639-a579-36512ba94069/node-ca/0.log" Apr 23 08:50:12.904474 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:12.904439 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-cmdv2_a9685c08-6c7b-40e4-971e-fd8cbcf83069/migrator/0.log" Apr 23 08:50:13.103908 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:13.103882 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-cmdv2_a9685c08-6c7b-40e4-971e-fd8cbcf83069/graceful-termination/0.log" Apr 23 08:50:13.367863 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:13.367785 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" Apr 23 08:50:13.367863 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:13.367817 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" Apr 23 08:50:13.368135 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:13.368123 2579 scope.go:117] "RemoveContainer" containerID="4dde2d5ed4235dd873daf4ab176924918ce69a9ae3b044719d852eaf2bcd8741" Apr 23 08:50:13.368284 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:13.368269 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7ftql_openshift-console-operator(8e2935cf-4651-42ac-bd9e-54accc810a7a)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" podUID="8e2935cf-4651-42ac-bd9e-54accc810a7a" Apr 23 08:50:13.855944 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:13.855911 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs\") pod \"network-metrics-daemon-96m6d\" (UID: \"f3e78a7f-f7cc-48d6-a29a-7d418c195aef\") " pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:50:13.856116 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:13.856049 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 08:50:13.856116 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:13.856108 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs podName:f3e78a7f-f7cc-48d6-a29a-7d418c195aef nodeName:}" failed. No retries permitted until 2026-04-23 08:52:15.856092304 +0000 UTC m=+251.385572312 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs") pod "network-metrics-daemon-96m6d" (UID: "f3e78a7f-f7cc-48d6-a29a-7d418c195aef") : secret "metrics-daemon-secret" not found Apr 23 08:50:18.899977 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:18.899936 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-metrics-certs\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:18.900476 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:18.899988 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-service-ca-bundle\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:18.900476 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:18.900033 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-st52h\" (UID: \"b0e5f2a3-2fce-4078-99b7-c4be47e87be7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h" Apr 23 08:50:18.900759 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:18.900733 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-service-ca-bundle\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:18.902417 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:18.902398 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcc4b0c4-24d9-42a8-826a-aeb82cce2a08-metrics-certs\") pod \"router-default-84bcc947b6-dxbqx\" (UID: \"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08\") " pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:18.902473 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:18.902422 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b0e5f2a3-2fce-4078-99b7-c4be47e87be7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-st52h\" (UID: \"b0e5f2a3-2fce-4078-99b7-c4be47e87be7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h" Apr 23 08:50:18.976984 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:18.976959 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-r69jp\"" Apr 23 08:50:18.981381 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:18.981365 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-g2b9m\"" Apr 23 08:50:18.985791 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:18.985779 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h" Apr 23 08:50:18.990433 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:18.990414 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:19.000440 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:19.000398 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qh5vs\" (UID: \"450d7653-7e13-4231-acf1-3736103fe1fd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" Apr 23 08:50:19.000568 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:19.000550 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 08:50:19.000631 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:19.000623 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls podName:450d7653-7e13-4231-acf1-3736103fe1fd nodeName:}" failed. No retries permitted until 2026-04-23 08:50:35.000608735 +0000 UTC m=+150.530088740 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qh5vs" (UID: "450d7653-7e13-4231-acf1-3736103fe1fd") : secret "cluster-monitoring-operator-tls" not found Apr 23 08:50:19.106706 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:19.106678 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h"] Apr 23 08:50:19.123236 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:19.123212 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-84bcc947b6-dxbqx"] Apr 23 08:50:19.129570 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:50:19.129541 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcc4b0c4_24d9_42a8_826a_aeb82cce2a08.slice/crio-da8f49846fe59fc210f52f7f016fa3c5120022b38214b62e9f0abbb56af26404 WatchSource:0}: Error finding container da8f49846fe59fc210f52f7f016fa3c5120022b38214b62e9f0abbb56af26404: Status 404 returned error can't find the container with id da8f49846fe59fc210f52f7f016fa3c5120022b38214b62e9f0abbb56af26404 Apr 23 08:50:19.443332 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:19.443229 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h" event={"ID":"b0e5f2a3-2fce-4078-99b7-c4be47e87be7","Type":"ContainerStarted","Data":"cbf75168aeabcd9f14aabe16818640a7f7aaac77fc0f11b00f03460e470e6b48"} Apr 23 08:50:19.444490 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:19.444467 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-84bcc947b6-dxbqx" event={"ID":"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08","Type":"ContainerStarted","Data":"e48edc6b16399d87ab335fb03f0398ed9d8281350671cb1e26afb1cbabc84f17"} Apr 23 08:50:19.444490 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:19.444494 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-84bcc947b6-dxbqx" event={"ID":"dcc4b0c4-24d9-42a8-826a-aeb82cce2a08","Type":"ContainerStarted","Data":"da8f49846fe59fc210f52f7f016fa3c5120022b38214b62e9f0abbb56af26404"} Apr 23 08:50:19.464463 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:19.464421 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-84bcc947b6-dxbqx" podStartSLOduration=16.464408926 podStartE2EDuration="16.464408926s" podCreationTimestamp="2026-04-23 08:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:50:19.463469823 +0000 UTC m=+134.992949849" watchObservedRunningTime="2026-04-23 08:50:19.464408926 +0000 UTC m=+134.993888943" Apr 23 08:50:19.991425 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:19.991387 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:19.994243 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:19.994222 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:20.447269 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:20.447228 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:20.448414 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:20.448393 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-84bcc947b6-dxbqx" Apr 23 08:50:21.453567 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:21.453528 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h" event={"ID":"b0e5f2a3-2fce-4078-99b7-c4be47e87be7","Type":"ContainerStarted","Data":"3cd853056dbdf6af7845db542673e7a38f6057951c689d3a7d176cd641bcbec6"} Apr 23 08:50:21.453567 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:21.453576 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h" event={"ID":"b0e5f2a3-2fce-4078-99b7-c4be47e87be7","Type":"ContainerStarted","Data":"f0dc6b76ce09f3ce8ff95cee43f95c587b31366460fbad32f24474d8f3bf938c"} Apr 23 08:50:21.469512 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:21.469470 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-st52h" podStartSLOduration=16.652296466 podStartE2EDuration="18.469457308s" podCreationTimestamp="2026-04-23 08:50:03 +0000 UTC" firstStartedPulling="2026-04-23 08:50:19.164719913 +0000 UTC m=+134.694199920" lastFinishedPulling="2026-04-23 08:50:20.981880754 +0000 UTC m=+136.511360762" observedRunningTime="2026-04-23 08:50:21.468763086 +0000 UTC m=+136.998243112" watchObservedRunningTime="2026-04-23 08:50:21.469457308 +0000 UTC m=+136.998937333" Apr 23 08:50:27.012453 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:27.012421 2579 scope.go:117] "RemoveContainer" containerID="4dde2d5ed4235dd873daf4ab176924918ce69a9ae3b044719d852eaf2bcd8741" Apr 23 08:50:27.471172 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:27.471145 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/1.log" Apr 23 08:50:27.471358 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:27.471195 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" event={"ID":"8e2935cf-4651-42ac-bd9e-54accc810a7a","Type":"ContainerStarted","Data":"a50e285b44202236693a3c88f9840f1c1435b47340a335f6de250f3c752117c7"} Apr 23 08:50:27.471579 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:27.471550 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" Apr 23 08:50:27.487217 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:27.487174 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" podStartSLOduration=22.531689273 podStartE2EDuration="24.487163454s" podCreationTimestamp="2026-04-23 08:50:03 +0000 UTC" firstStartedPulling="2026-04-23 08:50:03.481983049 +0000 UTC m=+119.011463056" lastFinishedPulling="2026-04-23 08:50:05.437457219 +0000 UTC m=+120.966937237" observedRunningTime="2026-04-23 08:50:27.486651786 +0000 UTC m=+143.016131812" watchObservedRunningTime="2026-04-23 08:50:27.487163454 +0000 UTC m=+143.016643528" Apr 23 08:50:27.735904 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:27.735828 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-7ftql" Apr 23 08:50:33.902653 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.902603 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gk2qp"] Apr 23 08:50:33.905680 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.905655 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gk2qp" Apr 23 08:50:33.908329 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.908307 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 08:50:33.908895 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.908877 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-wnmgb\"" Apr 23 08:50:33.909091 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.909069 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 08:50:33.909188 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.909072 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 08:50:33.909188 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.909098 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 08:50:33.918566 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.918542 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gk2qp"] Apr 23 08:50:33.981252 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.981219 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-4f5nb"] Apr 23 08:50:33.983835 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.983813 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-4f5nb" Apr 23 08:50:33.987257 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.987237 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6d4d878cc9-fkk7r"] Apr 23 08:50:33.988249 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.988231 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 08:50:33.988406 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.988389 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-s7pmx\"" Apr 23 08:50:33.988533 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.988518 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 08:50:33.989721 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.989703 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:33.992908 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.992890 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 08:50:33.993003 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.992890 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tpxkg\"" Apr 23 08:50:33.993101 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.993085 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 08:50:33.993170 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:33.993091 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 08:50:34.001969 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.001950 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-4f5nb"] Apr 23 08:50:34.005632 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.005611 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 08:50:34.008924 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.008906 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d4d878cc9-fkk7r"] Apr 23 08:50:34.013275 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.013256 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6df0870f-1da9-4821-9995-098c84fe5be1-data-volume\") pod \"insights-runtime-extractor-gk2qp\" (UID: \"6df0870f-1da9-4821-9995-098c84fe5be1\") " pod="openshift-insights/insights-runtime-extractor-gk2qp" Apr 23 08:50:34.013384 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.013287 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6df0870f-1da9-4821-9995-098c84fe5be1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gk2qp\" (UID: \"6df0870f-1da9-4821-9995-098c84fe5be1\") " pod="openshift-insights/insights-runtime-extractor-gk2qp" Apr 23 08:50:34.013384 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.013325 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfjjd\" (UniqueName: \"kubernetes.io/projected/6df0870f-1da9-4821-9995-098c84fe5be1-kube-api-access-rfjjd\") pod \"insights-runtime-extractor-gk2qp\" (UID: \"6df0870f-1da9-4821-9995-098c84fe5be1\") " pod="openshift-insights/insights-runtime-extractor-gk2qp" Apr 23 08:50:34.013384 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.013366 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg69j\" (UniqueName: \"kubernetes.io/projected/6da55603-9712-4719-b29b-d9feec7ef27f-kube-api-access-pg69j\") pod \"downloads-6bcc868b7-4f5nb\" (UID: \"6da55603-9712-4719-b29b-d9feec7ef27f\") " pod="openshift-console/downloads-6bcc868b7-4f5nb" Apr 23 08:50:34.013502 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.013403 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" 
(UniqueName: \"kubernetes.io/host-path/6df0870f-1da9-4821-9995-098c84fe5be1-crio-socket\") pod \"insights-runtime-extractor-gk2qp\" (UID: \"6df0870f-1da9-4821-9995-098c84fe5be1\") " pod="openshift-insights/insights-runtime-extractor-gk2qp" Apr 23 08:50:34.013502 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.013487 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6df0870f-1da9-4821-9995-098c84fe5be1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gk2qp\" (UID: \"6df0870f-1da9-4821-9995-098c84fe5be1\") " pod="openshift-insights/insights-runtime-extractor-gk2qp" Apr 23 08:50:34.114508 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.114478 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6df0870f-1da9-4821-9995-098c84fe5be1-data-volume\") pod \"insights-runtime-extractor-gk2qp\" (UID: \"6df0870f-1da9-4821-9995-098c84fe5be1\") " pod="openshift-insights/insights-runtime-extractor-gk2qp" Apr 23 08:50:34.114650 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.114516 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3dc2497b-5b53-4e93-ab87-8509553bef5d-ca-trust-extracted\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.114650 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.114547 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6df0870f-1da9-4821-9995-098c84fe5be1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gk2qp\" (UID: \"6df0870f-1da9-4821-9995-098c84fe5be1\") " pod="openshift-insights/insights-runtime-extractor-gk2qp" Apr 23 08:50:34.114650 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.114593 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3dc2497b-5b53-4e93-ab87-8509553bef5d-image-registry-private-configuration\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.114650 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.114624 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3dc2497b-5b53-4e93-ab87-8509553bef5d-registry-certificates\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.114880 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.114702 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3dc2497b-5b53-4e93-ab87-8509553bef5d-registry-tls\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.114880 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.114749 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3dc2497b-5b53-4e93-ab87-8509553bef5d-installation-pull-secrets\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.114880 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.114805 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rfjjd\" (UniqueName: \"kubernetes.io/projected/6df0870f-1da9-4821-9995-098c84fe5be1-kube-api-access-rfjjd\") pod \"insights-runtime-extractor-gk2qp\" (UID: \"6df0870f-1da9-4821-9995-098c84fe5be1\") " pod="openshift-insights/insights-runtime-extractor-gk2qp" Apr 23 08:50:34.114880 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.114832 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pg69j\" (UniqueName: \"kubernetes.io/projected/6da55603-9712-4719-b29b-d9feec7ef27f-kube-api-access-pg69j\") pod \"downloads-6bcc868b7-4f5nb\" (UID: \"6da55603-9712-4719-b29b-d9feec7ef27f\") " pod="openshift-console/downloads-6bcc868b7-4f5nb" Apr 23 08:50:34.114880 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.114859 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6df0870f-1da9-4821-9995-098c84fe5be1-crio-socket\") pod \"insights-runtime-extractor-gk2qp\" (UID: \"6df0870f-1da9-4821-9995-098c84fe5be1\") " pod="openshift-insights/insights-runtime-extractor-gk2qp" Apr 23 08:50:34.114880 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.114873 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6df0870f-1da9-4821-9995-098c84fe5be1-data-volume\") pod \"insights-runtime-extractor-gk2qp\" (UID: \"6df0870f-1da9-4821-9995-098c84fe5be1\") " pod="openshift-insights/insights-runtime-extractor-gk2qp" Apr 23 08:50:34.115199 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.114951 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6df0870f-1da9-4821-9995-098c84fe5be1-crio-socket\") pod \"insights-runtime-extractor-gk2qp\" (UID: \"6df0870f-1da9-4821-9995-098c84fe5be1\") " pod="openshift-insights/insights-runtime-extractor-gk2qp" Apr 23 08:50:34.115199 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.114953 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6df0870f-1da9-4821-9995-098c84fe5be1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gk2qp\" (UID: \"6df0870f-1da9-4821-9995-098c84fe5be1\") " pod="openshift-insights/insights-runtime-extractor-gk2qp" Apr 23 08:50:34.115199 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.114989 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l29sk\" (UniqueName: \"kubernetes.io/projected/3dc2497b-5b53-4e93-ab87-8509553bef5d-kube-api-access-l29sk\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.115199 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.115034 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3dc2497b-5b53-4e93-ab87-8509553bef5d-trusted-ca\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.115199 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.115067 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3dc2497b-5b53-4e93-ab87-8509553bef5d-bound-sa-token\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.115199 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.115137 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6df0870f-1da9-4821-9995-098c84fe5be1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gk2qp\" (UID: \"6df0870f-1da9-4821-9995-098c84fe5be1\") " pod="openshift-insights/insights-runtime-extractor-gk2qp" Apr 23 08:50:34.117248 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.117222 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6df0870f-1da9-4821-9995-098c84fe5be1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gk2qp\" (UID: \"6df0870f-1da9-4821-9995-098c84fe5be1\") " pod="openshift-insights/insights-runtime-extractor-gk2qp" Apr 23 08:50:34.127212 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.127186 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg69j\" (UniqueName: \"kubernetes.io/projected/6da55603-9712-4719-b29b-d9feec7ef27f-kube-api-access-pg69j\") pod \"downloads-6bcc868b7-4f5nb\" (UID: \"6da55603-9712-4719-b29b-d9feec7ef27f\") " pod="openshift-console/downloads-6bcc868b7-4f5nb" Apr 23 08:50:34.127316 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.127301 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfjjd\" (UniqueName: \"kubernetes.io/projected/6df0870f-1da9-4821-9995-098c84fe5be1-kube-api-access-rfjjd\") pod \"insights-runtime-extractor-gk2qp\" (UID: \"6df0870f-1da9-4821-9995-098c84fe5be1\") " pod="openshift-insights/insights-runtime-extractor-gk2qp" Apr 23 08:50:34.216032 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.215952 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gk2qp" Apr 23 08:50:34.216167 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.216061 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l29sk\" (UniqueName: \"kubernetes.io/projected/3dc2497b-5b53-4e93-ab87-8509553bef5d-kube-api-access-l29sk\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.216167 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.216095 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3dc2497b-5b53-4e93-ab87-8509553bef5d-trusted-ca\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.216167 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.216114 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3dc2497b-5b53-4e93-ab87-8509553bef5d-bound-sa-token\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.216328 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.216240 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3dc2497b-5b53-4e93-ab87-8509553bef5d-ca-trust-extracted\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.216328 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.216272 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3dc2497b-5b53-4e93-ab87-8509553bef5d-image-registry-private-configuration\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.216328 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.216294 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3dc2497b-5b53-4e93-ab87-8509553bef5d-registry-certificates\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.216328 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.216329 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3dc2497b-5b53-4e93-ab87-8509553bef5d-registry-tls\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.216552 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.216380 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3dc2497b-5b53-4e93-ab87-8509553bef5d-installation-pull-secrets\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " 
pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.216995 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.216655 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3dc2497b-5b53-4e93-ab87-8509553bef5d-ca-trust-extracted\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.217195 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.217166 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3dc2497b-5b53-4e93-ab87-8509553bef5d-registry-certificates\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.217556 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.217530 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3dc2497b-5b53-4e93-ab87-8509553bef5d-trusted-ca\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.219196 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.219170 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3dc2497b-5b53-4e93-ab87-8509553bef5d-registry-tls\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.219300 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.219181 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3dc2497b-5b53-4e93-ab87-8509553bef5d-image-registry-private-configuration\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.219381 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.219333 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3dc2497b-5b53-4e93-ab87-8509553bef5d-installation-pull-secrets\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.227312 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.227276 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l29sk\" (UniqueName: \"kubernetes.io/projected/3dc2497b-5b53-4e93-ab87-8509553bef5d-kube-api-access-l29sk\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.227562 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.227540 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3dc2497b-5b53-4e93-ab87-8509553bef5d-bound-sa-token\") pod \"image-registry-6d4d878cc9-fkk7r\" (UID: \"3dc2497b-5b53-4e93-ab87-8509553bef5d\") " pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.295662 ip-10-0-131-47 
kubenswrapper[2579]: I0423 08:50:34.295640 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-4f5nb" Apr 23 08:50:34.301730 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.301707 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:34.334463 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.334407 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gk2qp"] Apr 23 08:50:34.336564 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:50:34.336532 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6df0870f_1da9_4821_9995_098c84fe5be1.slice/crio-00d5dc026fb28351add0983616126baf900b2fa523ed9075fa96f4deb4320f03 WatchSource:0}: Error finding container 00d5dc026fb28351add0983616126baf900b2fa523ed9075fa96f4deb4320f03: Status 404 returned error can't find the container with id 00d5dc026fb28351add0983616126baf900b2fa523ed9075fa96f4deb4320f03 Apr 23 08:50:34.427976 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.427938 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-4f5nb"] Apr 23 08:50:34.430916 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:50:34.430886 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6da55603_9712_4719_b29b_d9feec7ef27f.slice/crio-72239b986ce928f1d9fe50f157e987907ea5952d526bb46894ac2c56908ccc3b WatchSource:0}: Error finding container 72239b986ce928f1d9fe50f157e987907ea5952d526bb46894ac2c56908ccc3b: Status 404 returned error can't find the container with id 72239b986ce928f1d9fe50f157e987907ea5952d526bb46894ac2c56908ccc3b Apr 23 08:50:34.450764 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.450742 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d4d878cc9-fkk7r"] Apr 23 08:50:34.453535 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:50:34.453506 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dc2497b_5b53_4e93_ab87_8509553bef5d.slice/crio-a519d94eb0a6c8a9711d4dd0bddc0e643cbc937f9cb57e3488a20279ecf328a3 WatchSource:0}: Error finding container a519d94eb0a6c8a9711d4dd0bddc0e643cbc937f9cb57e3488a20279ecf328a3: Status 404 returned error can't find the container with id a519d94eb0a6c8a9711d4dd0bddc0e643cbc937f9cb57e3488a20279ecf328a3 Apr 23 08:50:34.488572 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.488536 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" event={"ID":"3dc2497b-5b53-4e93-ab87-8509553bef5d","Type":"ContainerStarted","Data":"a519d94eb0a6c8a9711d4dd0bddc0e643cbc937f9cb57e3488a20279ecf328a3"} Apr 23 08:50:34.489517 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.489493 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-4f5nb" event={"ID":"6da55603-9712-4719-b29b-d9feec7ef27f","Type":"ContainerStarted","Data":"72239b986ce928f1d9fe50f157e987907ea5952d526bb46894ac2c56908ccc3b"} Apr 23 08:50:34.490767 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.490746 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gk2qp" 
event={"ID":"6df0870f-1da9-4821-9995-098c84fe5be1","Type":"ContainerStarted","Data":"02a099cba63cb4fb6ea2cc1fe6d414c27750f4600291cb510130b975a8777db3"} Apr 23 08:50:34.490828 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:34.490773 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gk2qp" event={"ID":"6df0870f-1da9-4821-9995-098c84fe5be1","Type":"ContainerStarted","Data":"00d5dc026fb28351add0983616126baf900b2fa523ed9075fa96f4deb4320f03"} Apr 23 08:50:35.023439 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:35.023402 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qh5vs\" (UID: \"450d7653-7e13-4231-acf1-3736103fe1fd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" Apr 23 08:50:35.026221 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:35.026192 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/450d7653-7e13-4231-acf1-3736103fe1fd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qh5vs\" (UID: \"450d7653-7e13-4231-acf1-3736103fe1fd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" Apr 23 08:50:35.265560 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:35.265536 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-68f6g\"" Apr 23 08:50:35.274304 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:35.274247 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" Apr 23 08:50:35.415454 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:35.415423 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs"] Apr 23 08:50:35.418458 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:50:35.418429 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod450d7653_7e13_4231_acf1_3736103fe1fd.slice/crio-143c81590965b68bac71f7d4fd96d5c3af08888e536844bc72782d952f8f919d WatchSource:0}: Error finding container 143c81590965b68bac71f7d4fd96d5c3af08888e536844bc72782d952f8f919d: Status 404 returned error can't find the container with id 143c81590965b68bac71f7d4fd96d5c3af08888e536844bc72782d952f8f919d Apr 23 08:50:35.495132 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:35.495103 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gk2qp" event={"ID":"6df0870f-1da9-4821-9995-098c84fe5be1","Type":"ContainerStarted","Data":"ced23c2769e8a0be72b72099480383fd97610becb24051c67a4c0a959224ca8d"} Apr 23 08:50:35.496383 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:35.496323 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" event={"ID":"450d7653-7e13-4231-acf1-3736103fe1fd","Type":"ContainerStarted","Data":"143c81590965b68bac71f7d4fd96d5c3af08888e536844bc72782d952f8f919d"} Apr 23 08:50:35.497601 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:35.497580 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" event={"ID":"3dc2497b-5b53-4e93-ab87-8509553bef5d","Type":"ContainerStarted","Data":"9c9846e4e94489b8c33e4ef8fc8d3206a599c9f9470c262fac689db075b7b0cb"} Apr 23 08:50:35.497762 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:35.497743 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:50:35.516061 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:35.516010 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" podStartSLOduration=2.515995255 podStartE2EDuration="2.515995255s" podCreationTimestamp="2026-04-23 08:50:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:50:35.51553788 +0000 UTC m=+151.045017906" watchObservedRunningTime="2026-04-23 08:50:35.515995255 +0000 UTC m=+151.045475284" Apr 23 08:50:37.504666 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:37.504624 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gk2qp" event={"ID":"6df0870f-1da9-4821-9995-098c84fe5be1","Type":"ContainerStarted","Data":"6261c5626c2e78ac6b33b07b46f4bd6d3eba80059d01fd3d63831707409b8665"} Apr 23 08:50:37.521117 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:37.521071 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gk2qp" podStartSLOduration=2.080277459 podStartE2EDuration="4.521053498s" podCreationTimestamp="2026-04-23 08:50:33 +0000 UTC" firstStartedPulling="2026-04-23 08:50:34.412916777 +0000 UTC m=+149.942396788" lastFinishedPulling="2026-04-23 08:50:36.853692811 
+0000 UTC m=+152.383172827" observedRunningTime="2026-04-23 08:50:37.520148329 +0000 UTC m=+153.049628367" watchObservedRunningTime="2026-04-23 08:50:37.521053498 +0000 UTC m=+153.050533524" Apr 23 08:50:38.508991 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:38.508948 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" event={"ID":"450d7653-7e13-4231-acf1-3736103fe1fd","Type":"ContainerStarted","Data":"bc34c024f5ad788dc8514bb1d44845cb0d59cfd786391a1d31cb474d8e187a6d"} Apr 23 08:50:38.526359 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:38.526301 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qh5vs" podStartSLOduration=33.438818052 podStartE2EDuration="35.526286448s" podCreationTimestamp="2026-04-23 08:50:03 +0000 UTC" firstStartedPulling="2026-04-23 08:50:35.420719716 +0000 UTC m=+150.950199903" lastFinishedPulling="2026-04-23 08:50:37.508188295 +0000 UTC m=+153.037668299" observedRunningTime="2026-04-23 08:50:38.525381072 +0000 UTC m=+154.054861098" watchObservedRunningTime="2026-04-23 08:50:38.526286448 +0000 UTC m=+154.055766504" Apr 23 08:50:40.924913 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:40.924862 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-sr575" podUID="533a84b7-1e94-4312-8393-c6c787af53b6" Apr 23 08:50:40.938048 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:40.938008 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-dd2cl" podUID="75c84a62-de6b-4301-9660-8d6ba1422a31" Apr 23 08:50:41.024592 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:41.024550 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-96m6d" podUID="f3e78a7f-f7cc-48d6-a29a-7d418c195aef" Apr 23 08:50:41.118509 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.118476 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-dgflf"] Apr 23 08:50:41.124114 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.124092 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" Apr 23 08:50:41.126229 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.126202 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 23 08:50:41.126833 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.126801 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-88k9b\"" Apr 23 08:50:41.126948 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.126832 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 23 08:50:41.126948 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.126867 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 08:50:41.128997 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.128974 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-dgflf"] Apr 23 08:50:41.178005 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.177931 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/825d2a5f-bf4f-4230-a808-8ac94f92aa8b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-dgflf\" (UID: \"825d2a5f-bf4f-4230-a808-8ac94f92aa8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" Apr 23 08:50:41.178159 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.178012 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/825d2a5f-bf4f-4230-a808-8ac94f92aa8b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-dgflf\" (UID: \"825d2a5f-bf4f-4230-a808-8ac94f92aa8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" Apr 23 08:50:41.178159 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.178043 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/825d2a5f-bf4f-4230-a808-8ac94f92aa8b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-dgflf\" (UID: \"825d2a5f-bf4f-4230-a808-8ac94f92aa8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" Apr 23 08:50:41.178159 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.178124 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dwzd\" (UniqueName: \"kubernetes.io/projected/825d2a5f-bf4f-4230-a808-8ac94f92aa8b-kube-api-access-8dwzd\") pod \"prometheus-operator-5676c8c784-dgflf\" (UID: \"825d2a5f-bf4f-4230-a808-8ac94f92aa8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" Apr 23 08:50:41.279052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.279013 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dwzd\" (UniqueName: \"kubernetes.io/projected/825d2a5f-bf4f-4230-a808-8ac94f92aa8b-kube-api-access-8dwzd\") pod \"prometheus-operator-5676c8c784-dgflf\" (UID: \"825d2a5f-bf4f-4230-a808-8ac94f92aa8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" Apr 23 08:50:41.279222 
ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.279080 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/825d2a5f-bf4f-4230-a808-8ac94f92aa8b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-dgflf\" (UID: \"825d2a5f-bf4f-4230-a808-8ac94f92aa8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" Apr 23 08:50:41.279222 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.279120 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/825d2a5f-bf4f-4230-a808-8ac94f92aa8b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-dgflf\" (UID: \"825d2a5f-bf4f-4230-a808-8ac94f92aa8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" Apr 23 08:50:41.279222 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.279154 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/825d2a5f-bf4f-4230-a808-8ac94f92aa8b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-dgflf\" (UID: \"825d2a5f-bf4f-4230-a808-8ac94f92aa8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" Apr 23 08:50:41.279395 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:41.279317 2579 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 23 08:50:41.279443 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:41.279425 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/825d2a5f-bf4f-4230-a808-8ac94f92aa8b-prometheus-operator-tls podName:825d2a5f-bf4f-4230-a808-8ac94f92aa8b nodeName:}" failed. No retries permitted until 2026-04-23 08:50:41.779394875 +0000 UTC m=+157.308874890 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/825d2a5f-bf4f-4230-a808-8ac94f92aa8b-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-dgflf" (UID: "825d2a5f-bf4f-4230-a808-8ac94f92aa8b") : secret "prometheus-operator-tls" not found Apr 23 08:50:41.279898 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.279877 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/825d2a5f-bf4f-4230-a808-8ac94f92aa8b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-dgflf\" (UID: \"825d2a5f-bf4f-4230-a808-8ac94f92aa8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" Apr 23 08:50:41.281831 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.281807 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/825d2a5f-bf4f-4230-a808-8ac94f92aa8b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-dgflf\" (UID: \"825d2a5f-bf4f-4230-a808-8ac94f92aa8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" Apr 23 08:50:41.287444 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.287412 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dwzd\" (UniqueName: \"kubernetes.io/projected/825d2a5f-bf4f-4230-a808-8ac94f92aa8b-kube-api-access-8dwzd\") pod \"prometheus-operator-5676c8c784-dgflf\" (UID: \"825d2a5f-bf4f-4230-a808-8ac94f92aa8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" Apr 23 08:50:41.517266 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.517188 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sr575" Apr 23 08:50:41.783220 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.783138 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/825d2a5f-bf4f-4230-a808-8ac94f92aa8b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-dgflf\" (UID: \"825d2a5f-bf4f-4230-a808-8ac94f92aa8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" Apr 23 08:50:41.786029 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:41.786001 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/825d2a5f-bf4f-4230-a808-8ac94f92aa8b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-dgflf\" (UID: \"825d2a5f-bf4f-4230-a808-8ac94f92aa8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" Apr 23 08:50:42.035939 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:42.035855 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" Apr 23 08:50:42.158869 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:42.158836 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-dgflf"] Apr 23 08:50:42.162012 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:50:42.161983 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod825d2a5f_bf4f_4230_a808_8ac94f92aa8b.slice/crio-9247010e64f4d7d842c533f3e166ea4153426b011aa0d2a97caf732369cb1dcb WatchSource:0}: Error finding container 9247010e64f4d7d842c533f3e166ea4153426b011aa0d2a97caf732369cb1dcb: Status 404 returned error can't find the container with id 9247010e64f4d7d842c533f3e166ea4153426b011aa0d2a97caf732369cb1dcb Apr 23 08:50:42.520771 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:42.520740 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" event={"ID":"825d2a5f-bf4f-4230-a808-8ac94f92aa8b","Type":"ContainerStarted","Data":"9247010e64f4d7d842c533f3e166ea4153426b011aa0d2a97caf732369cb1dcb"} Apr 23 08:50:44.528092 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:44.528048 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" event={"ID":"825d2a5f-bf4f-4230-a808-8ac94f92aa8b","Type":"ContainerStarted","Data":"39abf7c3ff00039256ff130c496e662d19b6cfe0f395c2df8e385b14e0f74719"} Apr 23 08:50:44.528092 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:44.528096 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" event={"ID":"825d2a5f-bf4f-4230-a808-8ac94f92aa8b","Type":"ContainerStarted","Data":"2706f24d17a342178c1d9047cfdd141b4a457e03589d7b18b7f4835d286b25cc"} Apr 23 08:50:44.546787 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:44.546720 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-dgflf" podStartSLOduration=2.139727968 podStartE2EDuration="3.546700486s" podCreationTimestamp="2026-04-23 08:50:41 +0000 UTC" firstStartedPulling="2026-04-23 08:50:42.16402764 +0000 UTC m=+157.693507656" lastFinishedPulling="2026-04-23 08:50:43.571000153 +0000 UTC m=+159.100480174" observedRunningTime="2026-04-23 08:50:44.545272573 +0000 UTC m=+160.074752648" watchObservedRunningTime="2026-04-23 08:50:44.546700486 +0000 UTC m=+160.076180517" Apr 23 08:50:45.919221 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:45.919174 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert\") pod \"ingress-canary-dd2cl\" (UID: \"75c84a62-de6b-4301-9660-8d6ba1422a31\") " pod="openshift-ingress-canary/ingress-canary-dd2cl" Apr 23 08:50:45.919666 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:45.919280 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:50:45.921885 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:45.921861 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/533a84b7-1e94-4312-8393-c6c787af53b6-metrics-tls\") pod \"dns-default-sr575\" (UID: \"533a84b7-1e94-4312-8393-c6c787af53b6\") " pod="openshift-dns/dns-default-sr575" Apr 23 08:50:45.922115 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:45.922094 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75c84a62-de6b-4301-9660-8d6ba1422a31-cert\") pod \"ingress-canary-dd2cl\" (UID: \"75c84a62-de6b-4301-9660-8d6ba1422a31\") " pod="openshift-ingress-canary/ingress-canary-dd2cl" Apr 23 08:50:46.026580 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.026550 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hskmq\"" Apr 23 08:50:46.028629 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.028607 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sr575" Apr 23 08:50:46.557214 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.557185 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-n7qsk"] Apr 23 08:50:46.562002 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.561979 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.570620 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.570591 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-bp5x8\"" Apr 23 08:50:46.570858 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.570840 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 08:50:46.571115 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.571094 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 08:50:46.571380 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.571363 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 08:50:46.625852 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.625821 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rcp4\" (UniqueName: \"kubernetes.io/projected/ff66f732-0549-4282-a494-c8a8ade9825d-kube-api-access-2rcp4\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.625852 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.625857 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ff66f732-0549-4282-a494-c8a8ade9825d-node-exporter-textfile\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.626052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.625894 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ff66f732-0549-4282-a494-c8a8ade9825d-node-exporter-tls\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.626052 
ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.625920 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff66f732-0549-4282-a494-c8a8ade9825d-metrics-client-ca\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.626052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.625950 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ff66f732-0549-4282-a494-c8a8ade9825d-node-exporter-wtmp\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.626052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.625986 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ff66f732-0549-4282-a494-c8a8ade9825d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.626289 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.626059 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff66f732-0549-4282-a494-c8a8ade9825d-sys\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.626289 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.626075 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ff66f732-0549-4282-a494-c8a8ade9825d-node-exporter-accelerators-collector-config\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.626289 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.626092 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ff66f732-0549-4282-a494-c8a8ade9825d-root\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.726894 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.726864 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ff66f732-0549-4282-a494-c8a8ade9825d-root\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.727071 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.726913 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rcp4\" (UniqueName: \"kubernetes.io/projected/ff66f732-0549-4282-a494-c8a8ade9825d-kube-api-access-2rcp4\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.727071 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.726983 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/ff66f732-0549-4282-a494-c8a8ade9825d-root\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.727071 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.727036 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ff66f732-0549-4282-a494-c8a8ade9825d-node-exporter-textfile\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.727219 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.727096 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ff66f732-0549-4282-a494-c8a8ade9825d-node-exporter-tls\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.727219 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.727121 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff66f732-0549-4282-a494-c8a8ade9825d-metrics-client-ca\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.727219 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.727153 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ff66f732-0549-4282-a494-c8a8ade9825d-node-exporter-wtmp\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.727219 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.727187 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ff66f732-0549-4282-a494-c8a8ade9825d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.727460 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:46.727216 2579 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 08:50:46.727460 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.727247 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff66f732-0549-4282-a494-c8a8ade9825d-sys\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.727460 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:46.727278 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff66f732-0549-4282-a494-c8a8ade9825d-node-exporter-tls podName:ff66f732-0549-4282-a494-c8a8ade9825d nodeName:}" failed. No retries permitted until 2026-04-23 08:50:47.227258142 +0000 UTC m=+162.756738163 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/ff66f732-0549-4282-a494-c8a8ade9825d-node-exporter-tls") pod "node-exporter-n7qsk" (UID: "ff66f732-0549-4282-a494-c8a8ade9825d") : secret "node-exporter-tls" not found Apr 23 08:50:46.727460 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.727295 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff66f732-0549-4282-a494-c8a8ade9825d-sys\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.727460 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.727304 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ff66f732-0549-4282-a494-c8a8ade9825d-node-exporter-accelerators-collector-config\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.727460 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.727355 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ff66f732-0549-4282-a494-c8a8ade9825d-node-exporter-wtmp\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.727756 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.727475 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ff66f732-0549-4282-a494-c8a8ade9825d-node-exporter-textfile\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.727812 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.727749 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ff66f732-0549-4282-a494-c8a8ade9825d-node-exporter-accelerators-collector-config\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.727864 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.727822 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff66f732-0549-4282-a494-c8a8ade9825d-metrics-client-ca\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.730065 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.730029 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ff66f732-0549-4282-a494-c8a8ade9825d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:46.768556 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:46.768528 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rcp4\" (UniqueName: \"kubernetes.io/projected/ff66f732-0549-4282-a494-c8a8ade9825d-kube-api-access-2rcp4\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " 
pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:47.232190 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:47.232152 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ff66f732-0549-4282-a494-c8a8ade9825d-node-exporter-tls\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:47.234775 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:47.234751 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ff66f732-0549-4282-a494-c8a8ade9825d-node-exporter-tls\") pod \"node-exporter-n7qsk\" (UID: \"ff66f732-0549-4282-a494-c8a8ade9825d\") " pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:47.473640 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:47.473608 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-n7qsk" Apr 23 08:50:51.210750 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:50:51.210709 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff66f732_0549_4282_a494_c8a8ade9825d.slice/crio-6637f4243525bb897a97da16a93c67a07f00800a466a68e01b70500f2bb74bf8 WatchSource:0}: Error finding container 6637f4243525bb897a97da16a93c67a07f00800a466a68e01b70500f2bb74bf8: Status 404 returned error can't find the container with id 6637f4243525bb897a97da16a93c67a07f00800a466a68e01b70500f2bb74bf8 Apr 23 08:50:51.310450 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:51.310430 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-9sm58"] Apr 23 08:50:51.313222 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:51.313201 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9sm58" Apr 23 08:50:51.316895 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:51.316868 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 23 08:50:51.317007 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:51.316914 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-kh9z9\"" Apr 23 08:50:51.329578 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:51.329553 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-9sm58"] Apr 23 08:50:51.342912 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:51.342855 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sr575"] Apr 23 08:50:51.344685 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:50:51.344656 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod533a84b7_1e94_4312_8393_c6c787af53b6.slice/crio-2b9ded3670e642e62aac27dab28cd30774d7ab59631ac572cb39375e2ee06e63 WatchSource:0}: Error finding container 2b9ded3670e642e62aac27dab28cd30774d7ab59631ac572cb39375e2ee06e63: Status 404 returned error can't find the container with id 2b9ded3670e642e62aac27dab28cd30774d7ab59631ac572cb39375e2ee06e63 Apr 23 08:50:51.468279 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:51.468209 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ec1f46f2-d7b6-4592-b515-562c3385bb81-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-9sm58\" (UID: \"ec1f46f2-d7b6-4592-b515-562c3385bb81\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9sm58" Apr 23 08:50:51.547994 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:51.547945 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n7qsk" event={"ID":"ff66f732-0549-4282-a494-c8a8ade9825d","Type":"ContainerStarted","Data":"6637f4243525bb897a97da16a93c67a07f00800a466a68e01b70500f2bb74bf8"} Apr 23 08:50:51.549491 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:51.549457 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-4f5nb" event={"ID":"6da55603-9712-4719-b29b-d9feec7ef27f","Type":"ContainerStarted","Data":"774256b4e7777efc10a55ad9df39c67ee041b6a49a1c3c4985a5707c80935254"} Apr 23 08:50:51.549861 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:51.549830 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-4f5nb" Apr 23 08:50:51.551156 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:51.551116 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sr575" event={"ID":"533a84b7-1e94-4312-8393-c6c787af53b6","Type":"ContainerStarted","Data":"2b9ded3670e642e62aac27dab28cd30774d7ab59631ac572cb39375e2ee06e63"} Apr 23 08:50:51.561689 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:51.561657 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-4f5nb" Apr 23 08:50:51.569054 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:51.569030 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/ec1f46f2-d7b6-4592-b515-562c3385bb81-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-9sm58\" (UID: \"ec1f46f2-d7b6-4592-b515-562c3385bb81\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9sm58" Apr 23 08:50:51.569326 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:51.569182 2579 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 23 08:50:51.569326 ip-10-0-131-47 kubenswrapper[2579]: E0423 08:50:51.569258 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec1f46f2-d7b6-4592-b515-562c3385bb81-monitoring-plugin-cert podName:ec1f46f2-d7b6-4592-b515-562c3385bb81 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:52.069243044 +0000 UTC m=+167.598723049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/ec1f46f2-d7b6-4592-b515-562c3385bb81-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-9sm58" (UID: "ec1f46f2-d7b6-4592-b515-562c3385bb81") : secret "monitoring-plugin-cert" not found Apr 23 08:50:51.571401 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:51.571362 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-4f5nb" podStartSLOduration=1.699625589 podStartE2EDuration="18.571326914s" podCreationTimestamp="2026-04-23 08:50:33 +0000 UTC" firstStartedPulling="2026-04-23 08:50:34.432820061 +0000 UTC m=+149.962300066" lastFinishedPulling="2026-04-23 08:50:51.304521384 +0000 UTC m=+166.834001391" observedRunningTime="2026-04-23 08:50:51.570037819 +0000 UTC m=+167.099517846" watchObservedRunningTime="2026-04-23 08:50:51.571326914 +0000 UTC m=+167.100806940" Apr 23 08:50:52.011376 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.011326 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:50:52.074632 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.074596 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ec1f46f2-d7b6-4592-b515-562c3385bb81-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-9sm58\" (UID: \"ec1f46f2-d7b6-4592-b515-562c3385bb81\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9sm58" Apr 23 08:50:52.077404 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.077378 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ec1f46f2-d7b6-4592-b515-562c3385bb81-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-9sm58\" (UID: \"ec1f46f2-d7b6-4592-b515-562c3385bb81\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9sm58" Apr 23 08:50:52.228144 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.228098 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9sm58" Apr 23 08:50:52.395190 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.395156 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-9sm58"] Apr 23 08:50:52.396993 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:50:52.396951 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec1f46f2_d7b6_4592_b515_562c3385bb81.slice/crio-e78315640775abdc82de399f173d0f53c4d30dc2adf4de724d54a86da4680208 WatchSource:0}: Error finding container e78315640775abdc82de399f173d0f53c4d30dc2adf4de724d54a86da4680208: Status 404 returned error can't find the container with id e78315640775abdc82de399f173d0f53c4d30dc2adf4de724d54a86da4680208 Apr 23 08:50:52.559513 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.559397 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n7qsk" event={"ID":"ff66f732-0549-4282-a494-c8a8ade9825d","Type":"ContainerDied","Data":"c0e69f55318f871a7bbf38fd415bfc112da76ddbb06174230b767b714091a3ed"} Apr 23 08:50:52.559513 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.559331 2579 generic.go:358] "Generic (PLEG): container finished" podID="ff66f732-0549-4282-a494-c8a8ade9825d" containerID="c0e69f55318f871a7bbf38fd415bfc112da76ddbb06174230b767b714091a3ed" exitCode=0 Apr 23 08:50:52.561654 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.561606 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9sm58" event={"ID":"ec1f46f2-d7b6-4592-b515-562c3385bb81","Type":"ContainerStarted","Data":"e78315640775abdc82de399f173d0f53c4d30dc2adf4de724d54a86da4680208"} Apr 23 08:50:52.863983 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.863900 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 08:50:52.868637 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.868609 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:52.871230 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.871204 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 08:50:52.872052 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.871668 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-9w4qp\"" Apr 23 08:50:52.873454 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.872954 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 08:50:52.873454 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.873055 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 08:50:52.873454 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.873077 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 08:50:52.873454 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.872960 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 08:50:52.873454 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.873283 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 08:50:52.873454 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.873329 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 08:50:52.873454 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.873362 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 08:50:52.873454 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.873288 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 08:50:52.873454 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.873288 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 08:50:52.873951 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.873652 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 08:50:52.873951 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.873690 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4ai01su1fum9p\"" Apr 23 08:50:52.878558 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.877160 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 08:50:52.882431 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.881652 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 08:50:52.883818 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.883694 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 08:50:52.983899 ip-10-0-131-47 
Apr 23 08:50:52.984093 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.983913 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b8a74f2-1318-418e-9416-b9da2d995059-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:50:52.984093 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.983943 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-web-config\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:50:52.984093 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.983966 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:50:52.984093 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.983999 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b8a74f2-1318-418e-9416-b9da2d995059-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:50:52.984093 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.984016 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2b8a74f2-1318-418e-9416-b9da2d995059-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:50:52.984093 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.984046 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:50:52.984093 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.984073 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:50:52.984093 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.984095 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:50:52.984543 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.984126 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:50:52.984543 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.984149 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:50:52.984543 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.984190 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2b8a74f2-1318-418e-9416-b9da2d995059-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:50:52.984543 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.984213 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2b8a74f2-1318-418e-9416-b9da2d995059-config-out\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:50:52.984543 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.984235 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b8a74f2-1318-418e-9416-b9da2d995059-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:50:52.984543 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.984258 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:50:52.984543 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.984287 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b8a74f2-1318-418e-9416-b9da2d995059-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:50:52.984543 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.984354 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk8n4\" (UniqueName: \"kubernetes.io/projected/2b8a74f2-1318-418e-9416-b9da2d995059-kube-api-access-dk8n4\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0"
\"kubernetes.io/projected/2b8a74f2-1318-418e-9416-b9da2d995059-kube-api-access-dk8n4\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:52.984543 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:52.984395 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2b8a74f2-1318-418e-9416-b9da2d995059-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.085571 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.084818 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b8a74f2-1318-418e-9416-b9da2d995059-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.085571 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.084863 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2b8a74f2-1318-418e-9416-b9da2d995059-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.085571 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.084898 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.085571 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.084923 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.085571 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.084948 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.085571 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.084982 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.085571 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.085007 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.085571 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.085055 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2b8a74f2-1318-418e-9416-b9da2d995059-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.085571 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.085078 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2b8a74f2-1318-418e-9416-b9da2d995059-config-out\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.085571 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.085103 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b8a74f2-1318-418e-9416-b9da2d995059-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.085571 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.085130 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.085571 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.085154 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b8a74f2-1318-418e-9416-b9da2d995059-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.085571 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.085202 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dk8n4\" (UniqueName: \"kubernetes.io/projected/2b8a74f2-1318-418e-9416-b9da2d995059-kube-api-access-dk8n4\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.085571 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.085246 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2b8a74f2-1318-418e-9416-b9da2d995059-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.085571 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.085277 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-config\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.085571 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.085315 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2b8a74f2-1318-418e-9416-b9da2d995059-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.085571 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.085366 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-web-config\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.086652 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.085389 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.089739 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.087617 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2b8a74f2-1318-418e-9416-b9da2d995059-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.089739 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.087824 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b8a74f2-1318-418e-9416-b9da2d995059-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.089739 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.088573 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2b8a74f2-1318-418e-9416-b9da2d995059-config-out\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.089739 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.088969 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b8a74f2-1318-418e-9416-b9da2d995059-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.089739 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.089121 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b8a74f2-1318-418e-9416-b9da2d995059-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.090084 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.089842 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b8a74f2-1318-418e-9416-b9da2d995059-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.090259 ip-10-0-131-47 kubenswrapper[2579]: I0423 
08:50:53.090240 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.093311 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.093287 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2b8a74f2-1318-418e-9416-b9da2d995059-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.096380 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.095888 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.096380 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.096236 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2b8a74f2-1318-418e-9416-b9da2d995059-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.096866 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.096649 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-config\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.097029 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.096914 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-web-config\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.097185 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.097110 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.097674 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.097648 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.097766 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.097612 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:50:53.098591 ip-10-0-131-47 
Apr 23 08:50:53.099422 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.099402 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk8n4\" (UniqueName: \"kubernetes.io/projected/2b8a74f2-1318-418e-9416-b9da2d995059-kube-api-access-dk8n4\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:50:53.100806 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.100783 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2b8a74f2-1318-418e-9416-b9da2d995059-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2b8a74f2-1318-418e-9416-b9da2d995059\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:50:53.189712 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.189675 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 08:50:53.558015 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.557977 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 08:50:53.574989 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:50:53.574957 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b8a74f2_1318_418e_9416_b9da2d995059.slice/crio-eeab2d7317a363842b5bf38e3e47259ba95e9b354bf07bad53f19c3dd6c8595e WatchSource:0}: Error finding container eeab2d7317a363842b5bf38e3e47259ba95e9b354bf07bad53f19c3dd6c8595e: Status 404 returned error can't find the container with id eeab2d7317a363842b5bf38e3e47259ba95e9b354bf07bad53f19c3dd6c8595e
Apr 23 08:50:53.576398 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.576362 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n7qsk" event={"ID":"ff66f732-0549-4282-a494-c8a8ade9825d","Type":"ContainerStarted","Data":"c3456fee4267ce73c34eeb868d13c311dd3fc7c62db18f6e22a2e74d272036d1"}
Apr 23 08:50:53.576486 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.576413 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n7qsk" event={"ID":"ff66f732-0549-4282-a494-c8a8ade9825d","Type":"ContainerStarted","Data":"03403de1b601baf96917fe4b3e3883c05936a789a73d29995a88780ce7b0c9b6"}
Apr 23 08:50:53.643001 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:53.641951 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-n7qsk" podStartSLOduration=6.902021121 podStartE2EDuration="7.641930415s" podCreationTimestamp="2026-04-23 08:50:46 +0000 UTC" firstStartedPulling="2026-04-23 08:50:51.212806791 +0000 UTC m=+166.742286811" lastFinishedPulling="2026-04-23 08:50:51.952716087 +0000 UTC m=+167.482196105" observedRunningTime="2026-04-23 08:50:53.641749199 +0000 UTC m=+169.171229227" watchObservedRunningTime="2026-04-23 08:50:53.641930415 +0000 UTC m=+169.171410444"
Apr 23 08:50:54.012568 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:54.012479 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dd2cl"
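
The pod_startup_latency_tracker.go:104 line for node-exporter makes the tracker's arithmetic visible: podStartSLOduration is the end-to-end startup duration minus the time spent pulling images (lastFinishedPulling - firstStartedPulling). Reconstructed from the numbers above (a sketch; the last-nanosecond discrepancy comes from the monotonic m=+ clock used for the pull window):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        e2e := 7641930415 * time.Nanosecond // podStartE2EDuration = 7.641930415s
        pull := 739909296 * time.Nanosecond // lastFinishedPulling - firstStartedPulling
        fmt.Println("podStartSLOduration ~", e2e-pull) // ~6.902021s, matching the logged 6.902021121
    }
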
2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dd2cl" Apr 23 08:50:54.015324 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:54.015288 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-55hsm\"" Apr 23 08:50:54.023520 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:54.023397 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dd2cl" Apr 23 08:50:54.307274 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:54.307143 2579 patch_prober.go:28] interesting pod/image-registry-6d4d878cc9-fkk7r container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 08:50:54.307274 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:54.307207 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" podUID="3dc2497b-5b53-4e93-ab87-8509553bef5d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 08:50:54.581805 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:54.581723 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sr575" event={"ID":"533a84b7-1e94-4312-8393-c6c787af53b6","Type":"ContainerStarted","Data":"cb137bd081ba237b633de20f157f5f87b32734e26fbf5988d52ba34cfeec1270"} Apr 23 08:50:54.581805 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:54.581767 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sr575" event={"ID":"533a84b7-1e94-4312-8393-c6c787af53b6","Type":"ContainerStarted","Data":"9278e318baa949a8275a954be139d626f901aa93f0650b974fdb300d5c051ef5"} Apr 23 08:50:54.582240 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:54.581994 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-sr575" Apr 23 08:50:54.583017 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:54.582986 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2b8a74f2-1318-418e-9416-b9da2d995059","Type":"ContainerStarted","Data":"eeab2d7317a363842b5bf38e3e47259ba95e9b354bf07bad53f19c3dd6c8595e"} Apr 23 08:50:54.601354 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:54.601280 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-sr575" podStartSLOduration=135.554615126 podStartE2EDuration="2m17.601267342s" podCreationTimestamp="2026-04-23 08:48:37 +0000 UTC" firstStartedPulling="2026-04-23 08:50:51.346300236 +0000 UTC m=+166.875780239" lastFinishedPulling="2026-04-23 08:50:53.392952445 +0000 UTC m=+168.922432455" observedRunningTime="2026-04-23 08:50:54.599516042 +0000 UTC m=+170.128996069" watchObservedRunningTime="2026-04-23 08:50:54.601267342 +0000 UTC m=+170.130747370" Apr 23 08:50:54.752518 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:54.752427 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dd2cl"] Apr 23 08:50:55.144371 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:50:55.144120 2579 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75c84a62_de6b_4301_9660_8d6ba1422a31.slice/crio-f1f5ef345619f5afa32f3f74c06a193fce81fd4e39448c44d97dcd9a7a21ecf3 WatchSource:0}: Error finding container f1f5ef345619f5afa32f3f74c06a193fce81fd4e39448c44d97dcd9a7a21ecf3: Status 404 returned error can't find the container with id f1f5ef345619f5afa32f3f74c06a193fce81fd4e39448c44d97dcd9a7a21ecf3 Apr 23 08:50:55.587575 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:55.587325 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9sm58" event={"ID":"ec1f46f2-d7b6-4592-b515-562c3385bb81","Type":"ContainerStarted","Data":"598457f2489c9bdf8b5ca8ef890afbb570a1fdf6063f86a3a8ae3da3d462094a"} Apr 23 08:50:55.587575 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:55.587404 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9sm58" Apr 23 08:50:55.589259 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:55.589220 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2b8a74f2-1318-418e-9416-b9da2d995059","Type":"ContainerStarted","Data":"e2d8c11e04ef695ea047f97bfbb424d0932f0438c825fe2f1d9047ed4aa55692"} Apr 23 08:50:55.590862 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:55.590813 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dd2cl" event={"ID":"75c84a62-de6b-4301-9660-8d6ba1422a31","Type":"ContainerStarted","Data":"f1f5ef345619f5afa32f3f74c06a193fce81fd4e39448c44d97dcd9a7a21ecf3"} Apr 23 08:50:55.593103 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:55.593084 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9sm58" Apr 23 08:50:55.623947 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:55.623897 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9sm58" podStartSLOduration=1.608653898 podStartE2EDuration="4.623881982s" podCreationTimestamp="2026-04-23 08:50:51 +0000 UTC" firstStartedPulling="2026-04-23 08:50:52.400150228 +0000 UTC m=+167.929630237" lastFinishedPulling="2026-04-23 08:50:55.415378311 +0000 UTC m=+170.944858321" observedRunningTime="2026-04-23 08:50:55.606606466 +0000 UTC m=+171.136086493" watchObservedRunningTime="2026-04-23 08:50:55.623881982 +0000 UTC m=+171.153362012" Apr 23 08:50:56.508251 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:56.508180 2579 patch_prober.go:28] interesting pod/image-registry-6d4d878cc9-fkk7r container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 08:50:56.508623 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:56.508587 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" podUID="3dc2497b-5b53-4e93-ab87-8509553bef5d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 08:50:56.597477 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:56.597435 2579 generic.go:358] "Generic (PLEG): container finished" podID="2b8a74f2-1318-418e-9416-b9da2d995059" 
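
The patch_prober.go:28 / prober.go:120 pairs above record kubelet's HTTP liveness and readiness probes against the image-registry container returning 503 while its backend health check (/debug/health) is failing; an HTTP probe counts any 2xx or 3xx status as success. A hand-rolled equivalent (illustrative, not kubelet's prober; the URL is hypothetical):

    package main

    import (
        "fmt"
        "net/http"
    )

    // probe issues one HTTP GET and applies the usual success rule:
    // any status in [200, 400) passes, everything else fails.
    func probe(url string) string {
        resp, err := http.Get(url)
        if err != nil {
            return "failure: " + err.Error()
        }
        defer resp.Body.Close()
        if resp.StatusCode >= 200 && resp.StatusCode < 400 {
            return "success"
        }
        return fmt.Sprintf("failure: HTTP probe failed with statuscode: %d", resp.StatusCode)
    }

    func main() {
        fmt.Println(probe("http://10.0.131.47:5000/healthz")) // hypothetical registry endpoint
    }
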
containerID="e2d8c11e04ef695ea047f97bfbb424d0932f0438c825fe2f1d9047ed4aa55692" exitCode=0 Apr 23 08:50:56.597915 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:56.597534 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2b8a74f2-1318-418e-9416-b9da2d995059","Type":"ContainerDied","Data":"e2d8c11e04ef695ea047f97bfbb424d0932f0438c825fe2f1d9047ed4aa55692"} Apr 23 08:50:58.607108 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:58.607069 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dd2cl" event={"ID":"75c84a62-de6b-4301-9660-8d6ba1422a31","Type":"ContainerStarted","Data":"e267e435267c9868b34aecb57ac21154c00d606ea8e7b641dd15dde022722b21"} Apr 23 08:50:58.622888 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:50:58.622836 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dd2cl" podStartSLOduration=139.18587062 podStartE2EDuration="2m21.622821734s" podCreationTimestamp="2026-04-23 08:48:37 +0000 UTC" firstStartedPulling="2026-04-23 08:50:55.413865334 +0000 UTC m=+170.943345353" lastFinishedPulling="2026-04-23 08:50:57.85081645 +0000 UTC m=+173.380296467" observedRunningTime="2026-04-23 08:50:58.622066478 +0000 UTC m=+174.151546505" watchObservedRunningTime="2026-04-23 08:50:58.622821734 +0000 UTC m=+174.152301760" Apr 23 08:51:01.620221 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:51:01.620131 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2b8a74f2-1318-418e-9416-b9da2d995059","Type":"ContainerStarted","Data":"8948a6d6742a27b44cf7208a818c804a3bda73807082d7e9a6942cfef64e8d3e"} Apr 23 08:51:01.620221 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:51:01.620177 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2b8a74f2-1318-418e-9416-b9da2d995059","Type":"ContainerStarted","Data":"be65e6b91fcc8356c739d8d544b66e7328904b3153ce5e1905ed052f5d34574a"} Apr 23 08:51:04.305772 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:51:04.305743 2579 patch_prober.go:28] interesting pod/image-registry-6d4d878cc9-fkk7r container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 08:51:04.306120 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:51:04.305792 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" podUID="3dc2497b-5b53-4e93-ab87-8509553bef5d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 08:51:04.593802 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:51:04.593726 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-sr575" Apr 23 08:51:04.632909 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:51:04.632880 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2b8a74f2-1318-418e-9416-b9da2d995059","Type":"ContainerStarted","Data":"db5e2ba6249026b09eab19a954b73ee80e8716df14051bec8b3bd92fb2a2cb59"} Apr 23 08:51:04.632909 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:51:04.632911 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"2b8a74f2-1318-418e-9416-b9da2d995059","Type":"ContainerStarted","Data":"53491749254c571de60b24a204dad38f40603def7888693f3f962e5da4a6dbf2"} Apr 23 08:51:04.632909 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:51:04.632920 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2b8a74f2-1318-418e-9416-b9da2d995059","Type":"ContainerStarted","Data":"bb56bc3cfc2db3dbd6a4700e2cbe14f887c8381f702254a39df514445e7a6efe"} Apr 23 08:51:04.633132 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:51:04.632929 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2b8a74f2-1318-418e-9416-b9da2d995059","Type":"ContainerStarted","Data":"64359d12d21314e7c9d406031557a433bd375e1c41839c4d27d6203fd0234dba"} Apr 23 08:51:04.665945 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:51:04.665891 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.683234188 podStartE2EDuration="12.665869724s" podCreationTimestamp="2026-04-23 08:50:52 +0000 UTC" firstStartedPulling="2026-04-23 08:50:53.577573109 +0000 UTC m=+169.107053117" lastFinishedPulling="2026-04-23 08:51:03.560208645 +0000 UTC m=+179.089688653" observedRunningTime="2026-04-23 08:51:04.664435654 +0000 UTC m=+180.193915680" watchObservedRunningTime="2026-04-23 08:51:04.665869724 +0000 UTC m=+180.195349753" Apr 23 08:51:06.505589 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:51:06.505558 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6d4d878cc9-fkk7r" Apr 23 08:51:08.190831 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:51:08.190794 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:51:53.191254 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:51:53.191220 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:51:53.207417 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:51:53.207389 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:51:53.780392 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:51:53.780324 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 08:52:15.892360 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:52:15.892306 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs\") pod \"network-metrics-daemon-96m6d\" (UID: \"f3e78a7f-f7cc-48d6-a29a-7d418c195aef\") " pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:52:15.894742 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:52:15.894723 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3e78a7f-f7cc-48d6-a29a-7d418c195aef-metrics-certs\") pod \"network-metrics-daemon-96m6d\" (UID: \"f3e78a7f-f7cc-48d6-a29a-7d418c195aef\") " pod="openshift-multus/network-metrics-daemon-96m6d" Apr 23 08:52:16.014503 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:52:16.014480 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zffjs\"" Apr 23 08:52:16.022243 ip-10-0-131-47 
Apr 23 08:52:16.138768 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:52:16.138746 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-96m6d"]
Apr 23 08:52:16.141468 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:52:16.141440 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3e78a7f_f7cc_48d6_a29a_7d418c195aef.slice/crio-74d822c5d06033c75ec726a31d32d6c3151f94a93b65ce7ff3d781f20cd18d6e WatchSource:0}: Error finding container 74d822c5d06033c75ec726a31d32d6c3151f94a93b65ce7ff3d781f20cd18d6e: Status 404 returned error can't find the container with id 74d822c5d06033c75ec726a31d32d6c3151f94a93b65ce7ff3d781f20cd18d6e
Apr 23 08:52:16.827508 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:52:16.827472 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-96m6d" event={"ID":"f3e78a7f-f7cc-48d6-a29a-7d418c195aef","Type":"ContainerStarted","Data":"74d822c5d06033c75ec726a31d32d6c3151f94a93b65ce7ff3d781f20cd18d6e"}
Apr 23 08:52:18.834409 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:52:18.834378 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-96m6d" event={"ID":"f3e78a7f-f7cc-48d6-a29a-7d418c195aef","Type":"ContainerStarted","Data":"bc64e0631e81bfa17d10f18017e8b51874ba2260ec5954cad12d0b0a91945489"}
Apr 23 08:52:19.844260 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:52:19.844220 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-96m6d" event={"ID":"f3e78a7f-f7cc-48d6-a29a-7d418c195aef","Type":"ContainerStarted","Data":"d07236c55186a86fabaa983024417c3cf980be6daa7f9e4e59352e3d790172ac"}
Apr 23 08:52:19.865902 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:52:19.865852 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-96m6d" podStartSLOduration=252.396585267 podStartE2EDuration="4m14.865834207s" podCreationTimestamp="2026-04-23 08:48:05 +0000 UTC" firstStartedPulling="2026-04-23 08:52:16.145628046 +0000 UTC m=+251.675108049" lastFinishedPulling="2026-04-23 08:52:18.614876978 +0000 UTC m=+254.144356989" observedRunningTime="2026-04-23 08:52:19.864849463 +0000 UTC m=+255.394329489" watchObservedRunningTime="2026-04-23 08:52:19.865834207 +0000 UTC m=+255.395314233"
Apr 23 08:53:04.898649 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:53:04.898596 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/1.log"
Apr 23 08:53:04.899278 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:53:04.898752 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/1.log"
Apr 23 08:53:04.901658 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:53:04.901634 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log"
Apr 23 08:53:04.901894 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:53:04.901876 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log"
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log" Apr 23 08:53:04.908125 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:53:04.908109 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 08:57:26.297699 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:57:26.297668 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-hzkbk/test-trainjob-c268h-node-0-0-c7gzd"] Apr 23 08:57:26.301018 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:57:26.300996 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-hzkbk/test-trainjob-c268h-node-0-0-c7gzd" Apr 23 08:57:26.304911 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:57:26.304893 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-hzkbk\"/\"kube-root-ca.crt\"" Apr 23 08:57:26.305026 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:57:26.305006 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-hzkbk\"/\"default-dockercfg-85pnh\"" Apr 23 08:57:26.305794 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:57:26.305777 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-hzkbk\"/\"openshift-service-ca.crt\"" Apr 23 08:57:26.319610 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:57:26.319582 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-hzkbk/test-trainjob-c268h-node-0-0-c7gzd"] Apr 23 08:57:26.353185 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:57:26.353156 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tppzc\" (UniqueName: \"kubernetes.io/projected/01609374-82db-43aa-a193-a88da0700b70-kube-api-access-tppzc\") pod \"test-trainjob-c268h-node-0-0-c7gzd\" (UID: \"01609374-82db-43aa-a193-a88da0700b70\") " pod="test-ns-hzkbk/test-trainjob-c268h-node-0-0-c7gzd" Apr 23 08:57:26.453811 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:57:26.453786 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tppzc\" (UniqueName: \"kubernetes.io/projected/01609374-82db-43aa-a193-a88da0700b70-kube-api-access-tppzc\") pod \"test-trainjob-c268h-node-0-0-c7gzd\" (UID: \"01609374-82db-43aa-a193-a88da0700b70\") " pod="test-ns-hzkbk/test-trainjob-c268h-node-0-0-c7gzd" Apr 23 08:57:26.463699 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:57:26.463676 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tppzc\" (UniqueName: \"kubernetes.io/projected/01609374-82db-43aa-a193-a88da0700b70-kube-api-access-tppzc\") pod \"test-trainjob-c268h-node-0-0-c7gzd\" (UID: \"01609374-82db-43aa-a193-a88da0700b70\") " pod="test-ns-hzkbk/test-trainjob-c268h-node-0-0-c7gzd" Apr 23 08:57:26.620971 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:57:26.620909 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-hzkbk/test-trainjob-c268h-node-0-0-c7gzd" Apr 23 08:57:26.736384 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:57:26.736359 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-hzkbk/test-trainjob-c268h-node-0-0-c7gzd"] Apr 23 08:57:26.738326 ip-10-0-131-47 kubenswrapper[2579]: W0423 08:57:26.738294 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01609374_82db_43aa_a193_a88da0700b70.slice/crio-4dac38191cdc2dd1b4eabbe61bd35c45e2d6d6a699ff05d1f4467d9aef57a7c8 WatchSource:0}: Error finding container 4dac38191cdc2dd1b4eabbe61bd35c45e2d6d6a699ff05d1f4467d9aef57a7c8: Status 404 returned error can't find the container with id 4dac38191cdc2dd1b4eabbe61bd35c45e2d6d6a699ff05d1f4467d9aef57a7c8 Apr 23 08:57:26.740231 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:57:26.740216 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 08:57:27.691118 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:57:27.691086 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-hzkbk/test-trainjob-c268h-node-0-0-c7gzd" event={"ID":"01609374-82db-43aa-a193-a88da0700b70","Type":"ContainerStarted","Data":"4dac38191cdc2dd1b4eabbe61bd35c45e2d6d6a699ff05d1f4467d9aef57a7c8"} Apr 23 08:58:04.922952 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:58:04.922924 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/1.log" Apr 23 08:58:04.924736 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:58:04.924712 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/1.log" Apr 23 08:58:04.925984 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:58:04.925961 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log" Apr 23 08:58:04.927734 ip-10-0-131-47 kubenswrapper[2579]: I0423 08:58:04.927715 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log" Apr 23 09:01:48.521259 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:48.521221 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-hzkbk/test-trainjob-c268h-node-0-0-c7gzd" event={"ID":"01609374-82db-43aa-a193-a88da0700b70","Type":"ContainerStarted","Data":"0c7dcc130ac1aebab79665099fd41ce1a442281c387d4900afac560cf783e157"} Apr 23 09:01:48.544934 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:48.544887 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-hzkbk/test-trainjob-c268h-node-0-0-c7gzd" podStartSLOduration=1.803070987 podStartE2EDuration="4m22.544873648s" podCreationTimestamp="2026-04-23 08:57:26 +0000 UTC" firstStartedPulling="2026-04-23 08:57:26.740352703 +0000 UTC m=+562.269832721" lastFinishedPulling="2026-04-23 09:01:47.482155365 +0000 UTC m=+823.011635382" observedRunningTime="2026-04-23 09:01:48.543671361 +0000 UTC m=+824.073151388" watchObservedRunningTime="2026-04-23 09:01:48.544873648 +0000 UTC m=+824.074353673" Apr 23 09:01:53.536558 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:53.536525 2579 generic.go:358] "Generic (PLEG): container finished" 
podID="01609374-82db-43aa-a193-a88da0700b70" containerID="0c7dcc130ac1aebab79665099fd41ce1a442281c387d4900afac560cf783e157" exitCode=0 Apr 23 09:01:53.536558 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:53.536563 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-hzkbk/test-trainjob-c268h-node-0-0-c7gzd" event={"ID":"01609374-82db-43aa-a193-a88da0700b70","Type":"ContainerDied","Data":"0c7dcc130ac1aebab79665099fd41ce1a442281c387d4900afac560cf783e157"} Apr 23 09:01:54.803208 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:54.803184 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-hzkbk/test-trainjob-c268h-node-0-0-c7gzd" Apr 23 09:01:54.872975 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:54.872937 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tppzc\" (UniqueName: \"kubernetes.io/projected/01609374-82db-43aa-a193-a88da0700b70-kube-api-access-tppzc\") pod \"01609374-82db-43aa-a193-a88da0700b70\" (UID: \"01609374-82db-43aa-a193-a88da0700b70\") " Apr 23 09:01:54.875220 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:54.875184 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01609374-82db-43aa-a193-a88da0700b70-kube-api-access-tppzc" (OuterVolumeSpecName: "kube-api-access-tppzc") pod "01609374-82db-43aa-a193-a88da0700b70" (UID: "01609374-82db-43aa-a193-a88da0700b70"). InnerVolumeSpecName "kube-api-access-tppzc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:01:54.974245 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:54.974203 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tppzc\" (UniqueName: \"kubernetes.io/projected/01609374-82db-43aa-a193-a88da0700b70-kube-api-access-tppzc\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 23 09:01:55.543126 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:55.543092 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-hzkbk/test-trainjob-c268h-node-0-0-c7gzd" event={"ID":"01609374-82db-43aa-a193-a88da0700b70","Type":"ContainerDied","Data":"4dac38191cdc2dd1b4eabbe61bd35c45e2d6d6a699ff05d1f4467d9aef57a7c8"} Apr 23 09:01:55.543126 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:55.543125 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dac38191cdc2dd1b4eabbe61bd35c45e2d6d6a699ff05d1f4467d9aef57a7c8" Apr 23 09:01:55.543453 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:55.543134 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-hzkbk/test-trainjob-c268h-node-0-0-c7gzd" Apr 23 09:01:56.752374 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:56.752321 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-vvs8w/test-trainjob-jzv82-node-0-0-m6srt"] Apr 23 09:01:56.752758 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:56.752648 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01609374-82db-43aa-a193-a88da0700b70" containerName="node" Apr 23 09:01:56.752758 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:56.752660 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="01609374-82db-43aa-a193-a88da0700b70" containerName="node" Apr 23 09:01:56.752758 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:56.752726 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="01609374-82db-43aa-a193-a88da0700b70" containerName="node" Apr 23 09:01:56.765746 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:56.765716 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-vvs8w/test-trainjob-jzv82-node-0-0-m6srt"] Apr 23 09:01:56.765909 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:56.765826 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-vvs8w/test-trainjob-jzv82-node-0-0-m6srt" Apr 23 09:01:56.768102 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:56.768068 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-vvs8w\"/\"default-dockercfg-2wsz2\"" Apr 23 09:01:56.768821 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:56.768797 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-vvs8w\"/\"openshift-service-ca.crt\"" Apr 23 09:01:56.769040 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:56.769022 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-vvs8w\"/\"kube-root-ca.crt\"" Apr 23 09:01:56.786308 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:56.786274 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rprgv\" (UniqueName: \"kubernetes.io/projected/99a19758-0036-4341-bbab-24f247492d47-kube-api-access-rprgv\") pod \"test-trainjob-jzv82-node-0-0-m6srt\" (UID: \"99a19758-0036-4341-bbab-24f247492d47\") " pod="test-ns-vvs8w/test-trainjob-jzv82-node-0-0-m6srt" Apr 23 09:01:56.886863 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:56.886822 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rprgv\" (UniqueName: \"kubernetes.io/projected/99a19758-0036-4341-bbab-24f247492d47-kube-api-access-rprgv\") pod \"test-trainjob-jzv82-node-0-0-m6srt\" (UID: \"99a19758-0036-4341-bbab-24f247492d47\") " pod="test-ns-vvs8w/test-trainjob-jzv82-node-0-0-m6srt" Apr 23 09:01:56.895202 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:56.895175 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rprgv\" (UniqueName: \"kubernetes.io/projected/99a19758-0036-4341-bbab-24f247492d47-kube-api-access-rprgv\") pod \"test-trainjob-jzv82-node-0-0-m6srt\" (UID: \"99a19758-0036-4341-bbab-24f247492d47\") " pod="test-ns-vvs8w/test-trainjob-jzv82-node-0-0-m6srt" Apr 23 09:01:57.085582 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:57.085493 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-vvs8w/test-trainjob-jzv82-node-0-0-m6srt" Apr 23 09:01:57.276022 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:57.275994 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-vvs8w/test-trainjob-jzv82-node-0-0-m6srt"] Apr 23 09:01:57.279095 ip-10-0-131-47 kubenswrapper[2579]: W0423 09:01:57.279062 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99a19758_0036_4341_bbab_24f247492d47.slice/crio-0bfd7eaa566958730f92c629e8deb97392ea6e1fab7c7e0f3854af9d33b2009f WatchSource:0}: Error finding container 0bfd7eaa566958730f92c629e8deb97392ea6e1fab7c7e0f3854af9d33b2009f: Status 404 returned error can't find the container with id 0bfd7eaa566958730f92c629e8deb97392ea6e1fab7c7e0f3854af9d33b2009f Apr 23 09:01:57.549776 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:01:57.549743 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-vvs8w/test-trainjob-jzv82-node-0-0-m6srt" event={"ID":"99a19758-0036-4341-bbab-24f247492d47","Type":"ContainerStarted","Data":"0bfd7eaa566958730f92c629e8deb97392ea6e1fab7c7e0f3854af9d33b2009f"} Apr 23 09:03:04.988829 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:03:04.988790 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/1.log" Apr 23 09:03:04.989376 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:03:04.988790 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/1.log" Apr 23 09:03:04.995298 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:03:04.995275 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log" Apr 23 09:03:04.995466 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:03:04.995280 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log" Apr 23 09:06:16.694252 ip-10-0-131-47 kubenswrapper[2579]: E0423 09:06:16.694195 2579 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil" Apr 23 09:06:16.694252 ip-10-0-131-47 kubenswrapper[2579]: E0423 09:06:16.694255 2579 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 09:06:16.694790 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:06:16.694266 2579 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 09:06:36.398682 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:06:36.398645 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-vvs8w/test-trainjob-jzv82-node-0-0-m6srt" event={"ID":"99a19758-0036-4341-bbab-24f247492d47","Type":"ContainerStarted","Data":"569aab5c2a947939f1713198ec04a4bfec4710ff92b8e0a49adf6c3f584b39b4"} Apr 23 09:06:36.425074 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:06:36.425027 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-vvs8w/test-trainjob-jzv82-node-0-0-m6srt" podStartSLOduration=2.145574383 podStartE2EDuration="4m40.425013077s" podCreationTimestamp="2026-04-23 09:01:56 +0000 UTC" 
firstStartedPulling="2026-04-23 09:01:57.281046958 +0000 UTC m=+832.810526962" lastFinishedPulling="2026-04-23 09:06:35.560485649 +0000 UTC m=+1111.089965656" observedRunningTime="2026-04-23 09:06:36.424005884 +0000 UTC m=+1111.953485920" watchObservedRunningTime="2026-04-23 09:06:36.425013077 +0000 UTC m=+1111.954493103" Apr 23 09:06:44.423666 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:06:44.423631 2579 generic.go:358] "Generic (PLEG): container finished" podID="99a19758-0036-4341-bbab-24f247492d47" containerID="569aab5c2a947939f1713198ec04a4bfec4710ff92b8e0a49adf6c3f584b39b4" exitCode=0 Apr 23 09:06:44.424157 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:06:44.423712 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-vvs8w/test-trainjob-jzv82-node-0-0-m6srt" event={"ID":"99a19758-0036-4341-bbab-24f247492d47","Type":"ContainerDied","Data":"569aab5c2a947939f1713198ec04a4bfec4710ff92b8e0a49adf6c3f584b39b4"} Apr 23 09:06:45.627144 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:06:45.627122 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-vvs8w/test-trainjob-jzv82-node-0-0-m6srt" Apr 23 09:06:45.709680 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:06:45.709596 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rprgv\" (UniqueName: \"kubernetes.io/projected/99a19758-0036-4341-bbab-24f247492d47-kube-api-access-rprgv\") pod \"99a19758-0036-4341-bbab-24f247492d47\" (UID: \"99a19758-0036-4341-bbab-24f247492d47\") " Apr 23 09:06:45.711872 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:06:45.711841 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99a19758-0036-4341-bbab-24f247492d47-kube-api-access-rprgv" (OuterVolumeSpecName: "kube-api-access-rprgv") pod "99a19758-0036-4341-bbab-24f247492d47" (UID: "99a19758-0036-4341-bbab-24f247492d47"). InnerVolumeSpecName "kube-api-access-rprgv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:06:45.810531 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:06:45.810483 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rprgv\" (UniqueName: \"kubernetes.io/projected/99a19758-0036-4341-bbab-24f247492d47-kube-api-access-rprgv\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 23 09:06:46.430760 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:06:46.430727 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-vvs8w/test-trainjob-jzv82-node-0-0-m6srt" Apr 23 09:06:46.430760 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:06:46.430738 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-vvs8w/test-trainjob-jzv82-node-0-0-m6srt" event={"ID":"99a19758-0036-4341-bbab-24f247492d47","Type":"ContainerDied","Data":"0bfd7eaa566958730f92c629e8deb97392ea6e1fab7c7e0f3854af9d33b2009f"} Apr 23 09:06:46.430760 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:06:46.430768 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bfd7eaa566958730f92c629e8deb97392ea6e1fab7c7e0f3854af9d33b2009f" Apr 23 09:08:05.012497 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:05.012465 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/1.log" Apr 23 09:08:05.013777 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:05.013751 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/1.log" Apr 23 09:08:05.018511 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:05.018490 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log" Apr 23 09:08:05.019711 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:05.019690 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log" Apr 23 09:08:14.747038 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:14.747010 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-vrn2w/test-trainjob-x6qdq-node-0-0-w58zw"] Apr 23 09:08:14.747500 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:14.747281 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99a19758-0036-4341-bbab-24f247492d47" containerName="node" Apr 23 09:08:14.747500 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:14.747292 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a19758-0036-4341-bbab-24f247492d47" containerName="node" Apr 23 09:08:14.747500 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:14.747385 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="99a19758-0036-4341-bbab-24f247492d47" containerName="node" Apr 23 09:08:14.749061 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:14.749044 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-vrn2w/test-trainjob-x6qdq-node-0-0-w58zw" Apr 23 09:08:14.751203 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:14.751182 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-vrn2w\"/\"kube-root-ca.crt\"" Apr 23 09:08:14.751333 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:14.751202 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-vrn2w\"/\"default-dockercfg-7t82d\"" Apr 23 09:08:14.751888 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:14.751870 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-vrn2w\"/\"openshift-service-ca.crt\"" Apr 23 09:08:14.757323 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:14.757294 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-vrn2w/test-trainjob-x6qdq-node-0-0-w58zw"] Apr 23 09:08:14.776714 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:14.776683 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws52b\" (UniqueName: \"kubernetes.io/projected/dacf9d46-7e30-40ea-ae1b-f26fde285f98-kube-api-access-ws52b\") pod \"test-trainjob-x6qdq-node-0-0-w58zw\" (UID: \"dacf9d46-7e30-40ea-ae1b-f26fde285f98\") " pod="test-ns-vrn2w/test-trainjob-x6qdq-node-0-0-w58zw" Apr 23 09:08:14.877042 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:14.877015 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ws52b\" (UniqueName: \"kubernetes.io/projected/dacf9d46-7e30-40ea-ae1b-f26fde285f98-kube-api-access-ws52b\") pod \"test-trainjob-x6qdq-node-0-0-w58zw\" (UID: \"dacf9d46-7e30-40ea-ae1b-f26fde285f98\") " pod="test-ns-vrn2w/test-trainjob-x6qdq-node-0-0-w58zw" Apr 23 09:08:14.884282 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:14.884259 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws52b\" (UniqueName: \"kubernetes.io/projected/dacf9d46-7e30-40ea-ae1b-f26fde285f98-kube-api-access-ws52b\") pod \"test-trainjob-x6qdq-node-0-0-w58zw\" (UID: \"dacf9d46-7e30-40ea-ae1b-f26fde285f98\") " pod="test-ns-vrn2w/test-trainjob-x6qdq-node-0-0-w58zw" Apr 23 09:08:15.069886 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:15.069806 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-vrn2w/test-trainjob-x6qdq-node-0-0-w58zw" Apr 23 09:08:15.247540 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:15.247513 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-vrn2w/test-trainjob-x6qdq-node-0-0-w58zw"] Apr 23 09:08:15.282673 ip-10-0-131-47 kubenswrapper[2579]: W0423 09:08:15.282644 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddacf9d46_7e30_40ea_ae1b_f26fde285f98.slice/crio-cf525ce0d4db3f1fac5262673035cb98c8dd432548f817bc9471670bf0767c95 WatchSource:0}: Error finding container cf525ce0d4db3f1fac5262673035cb98c8dd432548f817bc9471670bf0767c95: Status 404 returned error can't find the container with id cf525ce0d4db3f1fac5262673035cb98c8dd432548f817bc9471670bf0767c95 Apr 23 09:08:15.284181 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:15.284163 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 09:08:15.686856 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:08:15.686823 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-vrn2w/test-trainjob-x6qdq-node-0-0-w58zw" event={"ID":"dacf9d46-7e30-40ea-ae1b-f26fde285f98","Type":"ContainerStarted","Data":"cf525ce0d4db3f1fac5262673035cb98c8dd432548f817bc9471670bf0767c95"} Apr 23 09:13:05.034647 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:13:05.034620 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/1.log" Apr 23 09:13:05.037314 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:13:05.037056 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/1.log" Apr 23 09:13:05.037314 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:13:05.037254 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log" Apr 23 09:13:05.042303 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:13:05.042279 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log" Apr 23 09:15:21.965441 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:21.965402 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-vrn2w/test-trainjob-x6qdq-node-0-0-w58zw" event={"ID":"dacf9d46-7e30-40ea-ae1b-f26fde285f98","Type":"ContainerStarted","Data":"6c9b50a4f12b6a49a50b34f09a6b0407087aea447946fe27f64555b90eb935af"} Apr 23 09:15:21.967765 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:21.967740 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-vrn2w\"/\"default-dockercfg-7t82d\"" Apr 23 09:15:21.991431 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:21.991374 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-vrn2w/test-trainjob-x6qdq-node-0-0-w58zw" podStartSLOduration=1.730152963 podStartE2EDuration="7m7.991357961s" podCreationTimestamp="2026-04-23 09:08:14 +0000 UTC" firstStartedPulling="2026-04-23 09:08:15.284298595 +0000 UTC m=+1210.813778598" lastFinishedPulling="2026-04-23 09:15:21.545503592 +0000 UTC m=+1637.074983596" observedRunningTime="2026-04-23 09:15:21.989013134 +0000 UTC m=+1637.518493161" watchObservedRunningTime="2026-04-23 
09:15:21.991357961 +0000 UTC m=+1637.520837981" Apr 23 09:15:22.058595 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:22.058560 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-vrn2w\"/\"kube-root-ca.crt\"" Apr 23 09:15:22.068777 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:22.068753 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-vrn2w\"/\"openshift-service-ca.crt\"" Apr 23 09:15:25.978262 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:25.978230 2579 generic.go:358] "Generic (PLEG): container finished" podID="dacf9d46-7e30-40ea-ae1b-f26fde285f98" containerID="6c9b50a4f12b6a49a50b34f09a6b0407087aea447946fe27f64555b90eb935af" exitCode=0 Apr 23 09:15:25.978628 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:25.978289 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-vrn2w/test-trainjob-x6qdq-node-0-0-w58zw" event={"ID":"dacf9d46-7e30-40ea-ae1b-f26fde285f98","Type":"ContainerDied","Data":"6c9b50a4f12b6a49a50b34f09a6b0407087aea447946fe27f64555b90eb935af"} Apr 23 09:15:27.125200 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:27.125178 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-vrn2w/test-trainjob-x6qdq-node-0-0-w58zw" Apr 23 09:15:27.301732 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:27.301656 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws52b\" (UniqueName: \"kubernetes.io/projected/dacf9d46-7e30-40ea-ae1b-f26fde285f98-kube-api-access-ws52b\") pod \"dacf9d46-7e30-40ea-ae1b-f26fde285f98\" (UID: \"dacf9d46-7e30-40ea-ae1b-f26fde285f98\") " Apr 23 09:15:27.303910 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:27.303881 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dacf9d46-7e30-40ea-ae1b-f26fde285f98-kube-api-access-ws52b" (OuterVolumeSpecName: "kube-api-access-ws52b") pod "dacf9d46-7e30-40ea-ae1b-f26fde285f98" (UID: "dacf9d46-7e30-40ea-ae1b-f26fde285f98"). InnerVolumeSpecName "kube-api-access-ws52b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:15:27.402699 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:27.402669 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ws52b\" (UniqueName: \"kubernetes.io/projected/dacf9d46-7e30-40ea-ae1b-f26fde285f98-kube-api-access-ws52b\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 23 09:15:27.984828 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:27.984794 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-vrn2w/test-trainjob-x6qdq-node-0-0-w58zw" Apr 23 09:15:27.984828 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:27.984807 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-vrn2w/test-trainjob-x6qdq-node-0-0-w58zw" event={"ID":"dacf9d46-7e30-40ea-ae1b-f26fde285f98","Type":"ContainerDied","Data":"cf525ce0d4db3f1fac5262673035cb98c8dd432548f817bc9471670bf0767c95"} Apr 23 09:15:27.984828 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:27.984836 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf525ce0d4db3f1fac5262673035cb98c8dd432548f817bc9471670bf0767c95" Apr 23 09:15:28.989815 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:28.989785 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-xlxm7/test-trainjob-dhfqz-node-0-0-7pgns"] Apr 23 09:15:28.990199 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:28.990093 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dacf9d46-7e30-40ea-ae1b-f26fde285f98" containerName="node" Apr 23 09:15:28.990199 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:28.990106 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="dacf9d46-7e30-40ea-ae1b-f26fde285f98" containerName="node" Apr 23 09:15:28.990199 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:28.990160 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="dacf9d46-7e30-40ea-ae1b-f26fde285f98" containerName="node" Apr 23 09:15:29.006842 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:29.006818 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-xlxm7/test-trainjob-dhfqz-node-0-0-7pgns"] Apr 23 09:15:29.006976 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:29.006914 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-xlxm7/test-trainjob-dhfqz-node-0-0-7pgns" Apr 23 09:15:29.009142 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:29.009119 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-xlxm7\"/\"kube-root-ca.crt\"" Apr 23 09:15:29.009664 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:29.009643 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-xlxm7\"/\"default-dockercfg-hkdwr\"" Apr 23 09:15:29.009759 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:29.009642 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-xlxm7\"/\"openshift-service-ca.crt\"" Apr 23 09:15:29.014605 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:29.014577 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnj2l\" (UniqueName: \"kubernetes.io/projected/d32359de-670f-4b45-afaf-9d47083cc013-kube-api-access-fnj2l\") pod \"test-trainjob-dhfqz-node-0-0-7pgns\" (UID: \"d32359de-670f-4b45-afaf-9d47083cc013\") " pod="test-ns-xlxm7/test-trainjob-dhfqz-node-0-0-7pgns" Apr 23 09:15:29.115090 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:29.115053 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnj2l\" (UniqueName: \"kubernetes.io/projected/d32359de-670f-4b45-afaf-9d47083cc013-kube-api-access-fnj2l\") pod \"test-trainjob-dhfqz-node-0-0-7pgns\" (UID: \"d32359de-670f-4b45-afaf-9d47083cc013\") " pod="test-ns-xlxm7/test-trainjob-dhfqz-node-0-0-7pgns" Apr 23 09:15:29.123283 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:29.123254 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnj2l\" (UniqueName: \"kubernetes.io/projected/d32359de-670f-4b45-afaf-9d47083cc013-kube-api-access-fnj2l\") pod \"test-trainjob-dhfqz-node-0-0-7pgns\" (UID: \"d32359de-670f-4b45-afaf-9d47083cc013\") " pod="test-ns-xlxm7/test-trainjob-dhfqz-node-0-0-7pgns" Apr 23 09:15:29.327684 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:29.327602 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-xlxm7/test-trainjob-dhfqz-node-0-0-7pgns" Apr 23 09:15:29.449901 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:29.449873 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-xlxm7/test-trainjob-dhfqz-node-0-0-7pgns"] Apr 23 09:15:29.452195 ip-10-0-131-47 kubenswrapper[2579]: W0423 09:15:29.452165 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd32359de_670f_4b45_afaf_9d47083cc013.slice/crio-2dcd3710f4d35bf9e21d1bd943fa4671270fe1b9bbaf8d65ba404237e267e9d1 WatchSource:0}: Error finding container 2dcd3710f4d35bf9e21d1bd943fa4671270fe1b9bbaf8d65ba404237e267e9d1: Status 404 returned error can't find the container with id 2dcd3710f4d35bf9e21d1bd943fa4671270fe1b9bbaf8d65ba404237e267e9d1 Apr 23 09:15:29.454056 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:29.454037 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 09:15:29.990911 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:15:29.990871 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-xlxm7/test-trainjob-dhfqz-node-0-0-7pgns" event={"ID":"d32359de-670f-4b45-afaf-9d47083cc013","Type":"ContainerStarted","Data":"2dcd3710f4d35bf9e21d1bd943fa4671270fe1b9bbaf8d65ba404237e267e9d1"} Apr 23 09:18:05.060725 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:18:05.060616 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/1.log" Apr 23 09:18:05.064795 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:18:05.062593 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/1.log" Apr 23 09:18:05.064795 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:18:05.063645 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log" Apr 23 09:18:05.065642 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:18:05.065621 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log" Apr 23 09:23:21.367009 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:23:21.366972 2579 eviction_manager.go:376] "Eviction manager: attempting to reclaim" resourceName="ephemeral-storage" Apr 23 09:23:21.531425 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:23:21.367031 2579 container_gc.go:86] "Attempting to delete unused containers" Apr 23 09:23:21.531425 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:23:21.368600 2579 scope.go:117] "RemoveContainer" containerID="6c9b50a4f12b6a49a50b34f09a6b0407087aea447946fe27f64555b90eb935af" Apr 23 09:23:27.104478 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:23:27.104442 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasDiskPressure" Apr 23 09:23:50.983631 ip-10-0-131-47 kubenswrapper[2579]: E0423 09:23:50.983585 2579 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" filter="nil" Apr 23 09:23:50.983631 ip-10-0-131-47 kubenswrapper[2579]: E0423 09:23:50.983632 2579 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = 
DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 23 09:23:50.984109 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:23:50.983644 2579 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 23 09:25:05.069783 ip-10-0-131-47 kubenswrapper[2579]: E0423 09:25:05.069741 2579 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" filter="nil" Apr 23 09:25:05.069783 ip-10-0-131-47 kubenswrapper[2579]: E0423 09:25:05.069786 2579 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 23 09:25:05.070285 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:25:05.069798 2579 image_gc_manager.go:222] "Failed to monitor images" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 23 09:25:05.073895 ip-10-0-131-47 kubenswrapper[2579]: E0423 09:25:05.073870 2579 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil" Apr 23 09:25:05.073975 ip-10-0-131-47 kubenswrapper[2579]: E0423 09:25:05.073902 2579 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 09:25:05.073975 ip-10-0-131-47 kubenswrapper[2579]: E0423 09:25:05.073913 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 09:25:21.369646 ip-10-0-131-47 kubenswrapper[2579]: E0423 09:25:21.369593 2579 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" containerID="6c9b50a4f12b6a49a50b34f09a6b0407087aea447946fe27f64555b90eb935af" Apr 23 09:25:21.370158 ip-10-0-131-47 kubenswrapper[2579]: E0423 09:25:21.369657 2579 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" containerID="6c9b50a4f12b6a49a50b34f09a6b0407087aea447946fe27f64555b90eb935af" Apr 23 09:25:21.370158 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:25:21.369686 2579 scope.go:117] "RemoveContainer" containerID="569aab5c2a947939f1713198ec04a4bfec4710ff92b8e0a49adf6c3f584b39b4" Apr 23 09:26:20.985388 ip-10-0-131-47 kubenswrapper[2579]: E0423 09:26:20.985325 2579 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil" Apr 23 09:26:20.985388 ip-10-0-131-47 kubenswrapper[2579]: E0423 09:26:20.985390 2579 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 09:26:20.985890 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:20.985404 2579 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 09:26:24.095916 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:24.095884 2579 scope.go:117] "RemoveContainer" containerID="0c7dcc130ac1aebab79665099fd41ce1a442281c387d4900afac560cf783e157" Apr 23 09:26:24.170333 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:24.170309 2579 image_gc_manager.go:447] "Attempting to 
delete unused images" Apr 23 09:26:24.184697 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:24.184672 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/1.log" Apr 23 09:26:24.187138 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:24.187115 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log" Apr 23 09:26:24.191541 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:24.191509 2579 image_gc_manager.go:514] "Removing image to free bytes" imageID="5151a6030289f6d1ef2c984ebd3e465632a3bf64de79db6f7b3d6e2e638b0557" size=1065600018 runtimeHandler="" Apr 23 09:26:24.320878 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:24.320806 2579 image_gc_manager.go:514] "Removing image to free bytes" imageID="ac4be6c7a52584c773ae754a4ccfb9fb1db440f4c9d858ad0f78765a85625b4b" size=1065006420 runtimeHandler="" Apr 23 09:26:27.062201 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:27.062160 2579 image_gc_manager.go:514] "Removing image to free bytes" imageID="bd2f0c6a473dfa650b536cfe1992446bf45305b3ace698398143f161694113a5" size=20806872103 runtimeHandler="" Apr 23 09:26:32.518351 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:32.518317 2579 image_gc_manager.go:514] "Removing image to free bytes" imageID="7e65b8288e37c3f4fac04e8bf51240765caae34795b317d44d5399762a08b761" size=23201654702 runtimeHandler="" Apr 23 09:26:38.592179 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:38.592141 2579 image_gc_manager.go:514] "Removing image to free bytes" imageID="819e15fdec92d846e6d5de4b1b2988adcb74f6d3046689fe03c655b03a67975d" size=18873458221 runtimeHandler="" Apr 23 09:26:41.518336 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:41.518300 2579 eviction_manager.go:383] "Eviction manager: able to reduce resource pressure without evicting pods." 
resourceName="ephemeral-storage" Apr 23 09:26:42.218363 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:42.218313 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-xlxm7/test-trainjob-dhfqz-node-0-0-7pgns" event={"ID":"d32359de-670f-4b45-afaf-9d47083cc013","Type":"ContainerStarted","Data":"466d89fa37186c4be922ad3f533bc251f600aef70124cfa988f199ecaa9cb198"} Apr 23 09:26:42.220587 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:42.220567 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-xlxm7\"/\"default-dockercfg-hkdwr\"" Apr 23 09:26:42.243730 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:42.243669 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-xlxm7/test-trainjob-dhfqz-node-0-0-7pgns" podStartSLOduration=5.105430675 podStartE2EDuration="11m14.243650301s" podCreationTimestamp="2026-04-23 09:15:28 +0000 UTC" firstStartedPulling="2026-04-23 09:15:29.454163795 +0000 UTC m=+1644.983643798" lastFinishedPulling="2026-04-23 09:26:38.592383416 +0000 UTC m=+2314.121863424" observedRunningTime="2026-04-23 09:26:42.242592293 +0000 UTC m=+2317.772072318" watchObservedRunningTime="2026-04-23 09:26:42.243650301 +0000 UTC m=+2317.773130328" Apr 23 09:26:42.391387 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:42.391325 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-xlxm7\"/\"kube-root-ca.crt\"" Apr 23 09:26:42.402552 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:42.402492 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-xlxm7\"/\"openshift-service-ca.crt\"" Apr 23 09:26:58.267022 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:58.266987 2579 generic.go:358] "Generic (PLEG): container finished" podID="d32359de-670f-4b45-afaf-9d47083cc013" containerID="466d89fa37186c4be922ad3f533bc251f600aef70124cfa988f199ecaa9cb198" exitCode=0 Apr 23 09:26:58.267475 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:58.267040 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-xlxm7/test-trainjob-dhfqz-node-0-0-7pgns" event={"ID":"d32359de-670f-4b45-afaf-9d47083cc013","Type":"ContainerDied","Data":"466d89fa37186c4be922ad3f533bc251f600aef70124cfa988f199ecaa9cb198"} Apr 23 09:26:59.400064 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:59.400040 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-xlxm7/test-trainjob-dhfqz-node-0-0-7pgns" Apr 23 09:26:59.423392 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:59.423358 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnj2l\" (UniqueName: \"kubernetes.io/projected/d32359de-670f-4b45-afaf-9d47083cc013-kube-api-access-fnj2l\") pod \"d32359de-670f-4b45-afaf-9d47083cc013\" (UID: \"d32359de-670f-4b45-afaf-9d47083cc013\") " Apr 23 09:26:59.425594 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:59.425566 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d32359de-670f-4b45-afaf-9d47083cc013-kube-api-access-fnj2l" (OuterVolumeSpecName: "kube-api-access-fnj2l") pod "d32359de-670f-4b45-afaf-9d47083cc013" (UID: "d32359de-670f-4b45-afaf-9d47083cc013"). InnerVolumeSpecName "kube-api-access-fnj2l". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:26:59.524762 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:26:59.524670 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fnj2l\" (UniqueName: \"kubernetes.io/projected/d32359de-670f-4b45-afaf-9d47083cc013-kube-api-access-fnj2l\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 23 09:27:00.273737 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:00.273703 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-xlxm7/test-trainjob-dhfqz-node-0-0-7pgns" Apr 23 09:27:00.273737 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:00.273724 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-xlxm7/test-trainjob-dhfqz-node-0-0-7pgns" event={"ID":"d32359de-670f-4b45-afaf-9d47083cc013","Type":"ContainerDied","Data":"2dcd3710f4d35bf9e21d1bd943fa4671270fe1b9bbaf8d65ba404237e267e9d1"} Apr 23 09:27:00.273947 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:00.273757 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dcd3710f4d35bf9e21d1bd943fa4671270fe1b9bbaf8d65ba404237e267e9d1" Apr 23 09:27:01.329938 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:01.329909 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/test-ns-xlxm7_test-trainjob-dhfqz-node-0-0-7pgns_d32359de-670f-4b45-afaf-9d47083cc013/node/0.log" Apr 23 09:27:01.414161 ip-10-0-131-47 kubenswrapper[2579]: E0423 09:27:01.414126 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c9b50a4f12b6a49a50b34f09a6b0407087aea447946fe27f64555b90eb935af\": container with ID starting with 6c9b50a4f12b6a49a50b34f09a6b0407087aea447946fe27f64555b90eb935af not found: ID does not exist" containerID="6c9b50a4f12b6a49a50b34f09a6b0407087aea447946fe27f64555b90eb935af" Apr 23 09:27:01.616252 ip-10-0-131-47 kubenswrapper[2579]: E0423 09:27:01.616172 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"569aab5c2a947939f1713198ec04a4bfec4710ff92b8e0a49adf6c3f584b39b4\": container with ID starting with 569aab5c2a947939f1713198ec04a4bfec4710ff92b8e0a49adf6c3f584b39b4 not found: ID does not exist" containerID="569aab5c2a947939f1713198ec04a4bfec4710ff92b8e0a49adf6c3f584b39b4" Apr 23 09:27:02.111668 ip-10-0-131-47 kubenswrapper[2579]: E0423 09:27:02.111637 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c7dcc130ac1aebab79665099fd41ce1a442281c387d4900afac560cf783e157\": container with ID starting with 0c7dcc130ac1aebab79665099fd41ce1a442281c387d4900afac560cf783e157 not found: ID does not exist" containerID="0c7dcc130ac1aebab79665099fd41ce1a442281c387d4900afac560cf783e157" Apr 23 09:27:06.361522 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:06.361483 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-xlxm7/test-trainjob-dhfqz-node-0-0-7pgns"] Apr 23 09:27:06.362941 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:06.362918 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-xlxm7/test-trainjob-dhfqz-node-0-0-7pgns"] Apr 23 09:27:06.460951 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:06.460918 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-vrn2w/test-trainjob-x6qdq-node-0-0-w58zw"] Apr 23 09:27:06.464309 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:06.464284 2579 kubelet.go:2547] "SyncLoop 
REMOVE" source="api" pods=["test-ns-vrn2w/test-trainjob-x6qdq-node-0-0-w58zw"] Apr 23 09:27:06.726195 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:06.726158 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-vvs8w/test-trainjob-jzv82-node-0-0-m6srt"] Apr 23 09:27:06.727978 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:06.727954 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-vvs8w/test-trainjob-jzv82-node-0-0-m6srt"] Apr 23 09:27:07.016091 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:07.016014 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99a19758-0036-4341-bbab-24f247492d47" path="/var/lib/kubelet/pods/99a19758-0036-4341-bbab-24f247492d47/volumes" Apr 23 09:27:07.016336 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:07.016322 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d32359de-670f-4b45-afaf-9d47083cc013" path="/var/lib/kubelet/pods/d32359de-670f-4b45-afaf-9d47083cc013/volumes" Apr 23 09:27:07.016615 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:07.016603 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dacf9d46-7e30-40ea-ae1b-f26fde285f98" path="/var/lib/kubelet/pods/dacf9d46-7e30-40ea-ae1b-f26fde285f98/volumes" Apr 23 09:27:07.329455 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:07.329383 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-hzkbk/test-trainjob-c268h-node-0-0-c7gzd"] Apr 23 09:27:07.332486 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:07.332462 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-hzkbk/test-trainjob-c268h-node-0-0-c7gzd"] Apr 23 09:27:09.015696 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:09.015662 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01609374-82db-43aa-a193-a88da0700b70" path="/var/lib/kubelet/pods/01609374-82db-43aa-a193-a88da0700b70/volumes" Apr 23 09:27:54.185354 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:54.185315 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-29qjf/must-gather-9ql47"] Apr 23 09:27:54.185838 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:54.185620 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d32359de-670f-4b45-afaf-9d47083cc013" containerName="node" Apr 23 09:27:54.185838 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:54.185632 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d32359de-670f-4b45-afaf-9d47083cc013" containerName="node" Apr 23 09:27:54.185838 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:54.185680 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d32359de-670f-4b45-afaf-9d47083cc013" containerName="node" Apr 23 09:27:54.188649 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:54.188627 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-29qjf/must-gather-9ql47" Apr 23 09:27:54.191965 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:54.191943 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-29qjf\"/\"kube-root-ca.crt\"" Apr 23 09:27:54.192563 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:54.192541 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-29qjf\"/\"default-dockercfg-nnl8j\"" Apr 23 09:27:54.197054 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:54.197030 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-29qjf\"/\"openshift-service-ca.crt\"" Apr 23 09:27:54.199607 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:54.199587 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-29qjf/must-gather-9ql47"] Apr 23 09:27:54.285915 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:54.285866 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjn8q\" (UniqueName: \"kubernetes.io/projected/8054e630-9189-4839-8d29-76097f817697-kube-api-access-gjn8q\") pod \"must-gather-9ql47\" (UID: \"8054e630-9189-4839-8d29-76097f817697\") " pod="openshift-must-gather-29qjf/must-gather-9ql47" Apr 23 09:27:54.285915 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:54.285924 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8054e630-9189-4839-8d29-76097f817697-must-gather-output\") pod \"must-gather-9ql47\" (UID: \"8054e630-9189-4839-8d29-76097f817697\") " pod="openshift-must-gather-29qjf/must-gather-9ql47" Apr 23 09:27:54.386635 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:54.386585 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjn8q\" (UniqueName: \"kubernetes.io/projected/8054e630-9189-4839-8d29-76097f817697-kube-api-access-gjn8q\") pod \"must-gather-9ql47\" (UID: \"8054e630-9189-4839-8d29-76097f817697\") " pod="openshift-must-gather-29qjf/must-gather-9ql47" Apr 23 09:27:54.386837 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:54.386668 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8054e630-9189-4839-8d29-76097f817697-must-gather-output\") pod \"must-gather-9ql47\" (UID: \"8054e630-9189-4839-8d29-76097f817697\") " pod="openshift-must-gather-29qjf/must-gather-9ql47" Apr 23 09:27:54.387093 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:54.387069 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8054e630-9189-4839-8d29-76097f817697-must-gather-output\") pod \"must-gather-9ql47\" (UID: \"8054e630-9189-4839-8d29-76097f817697\") " pod="openshift-must-gather-29qjf/must-gather-9ql47" Apr 23 09:27:54.394624 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:54.394590 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjn8q\" (UniqueName: \"kubernetes.io/projected/8054e630-9189-4839-8d29-76097f817697-kube-api-access-gjn8q\") pod \"must-gather-9ql47\" (UID: \"8054e630-9189-4839-8d29-76097f817697\") " pod="openshift-must-gather-29qjf/must-gather-9ql47" Apr 23 09:27:54.515371 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:54.515241 2579 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-29qjf/must-gather-9ql47" Apr 23 09:27:54.640432 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:54.640402 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-29qjf/must-gather-9ql47"] Apr 23 09:27:54.643000 ip-10-0-131-47 kubenswrapper[2579]: W0423 09:27:54.642972 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8054e630_9189_4839_8d29_76097f817697.slice/crio-762bd29f999139ab24cd0b2cc2ab286e0d401dd2fc549225e54cfdeb847e5662 WatchSource:0}: Error finding container 762bd29f999139ab24cd0b2cc2ab286e0d401dd2fc549225e54cfdeb847e5662: Status 404 returned error can't find the container with id 762bd29f999139ab24cd0b2cc2ab286e0d401dd2fc549225e54cfdeb847e5662 Apr 23 09:27:54.644908 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:54.644891 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 09:27:55.436140 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:55.436102 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-29qjf/must-gather-9ql47" event={"ID":"8054e630-9189-4839-8d29-76097f817697","Type":"ContainerStarted","Data":"762bd29f999139ab24cd0b2cc2ab286e0d401dd2fc549225e54cfdeb847e5662"} Apr 23 09:27:56.442688 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:56.442647 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-29qjf/must-gather-9ql47" event={"ID":"8054e630-9189-4839-8d29-76097f817697","Type":"ContainerStarted","Data":"30adf7df0d5cdd3393cb20ce34e62869e241abc6076541262f5568a2db433ac1"} Apr 23 09:27:56.443238 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:56.443214 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-29qjf/must-gather-9ql47" event={"ID":"8054e630-9189-4839-8d29-76097f817697","Type":"ContainerStarted","Data":"fa08b4b7f5dd000ca2446de85a61c1a51b5f646a9afd93b12732a0ff60cd4b64"} Apr 23 09:27:56.460792 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:56.460731 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-29qjf/must-gather-9ql47" podStartSLOduration=1.609174666 podStartE2EDuration="2.460713203s" podCreationTimestamp="2026-04-23 09:27:54 +0000 UTC" firstStartedPulling="2026-04-23 09:27:54.645022093 +0000 UTC m=+2390.174502096" lastFinishedPulling="2026-04-23 09:27:55.496560629 +0000 UTC m=+2391.026040633" observedRunningTime="2026-04-23 09:27:56.458672895 +0000 UTC m=+2391.988152921" watchObservedRunningTime="2026-04-23 09:27:56.460713203 +0000 UTC m=+2391.990193229" Apr 23 09:27:57.136416 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:57.136380 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-q6kd2_f0ec8bf3-b2b1-4ced-9317-2706c95af066/global-pull-secret-syncer/0.log" Apr 23 09:27:57.224035 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:57.223992 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6hwnm_52011e07-e36d-48b0-bed0-421685c0e544/konnectivity-agent/0.log" Apr 23 09:27:57.333897 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:27:57.333862 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-47.ec2.internal_8b712f12e316b1a6ded9d349ca82d37a/haproxy/0.log" Apr 23 09:28:00.366066 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:00.365989 2579 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-qh5vs_450d7653-7e13-4231-acf1-3736103fe1fd/cluster-monitoring-operator/0.log" Apr 23 09:28:00.503393 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:00.503256 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-9sm58_ec1f46f2-d7b6-4592-b515-562c3385bb81/monitoring-plugin/0.log" Apr 23 09:28:00.716194 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:00.716162 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n7qsk_ff66f732-0549-4282-a494-c8a8ade9825d/node-exporter/0.log" Apr 23 09:28:00.743811 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:00.743767 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n7qsk_ff66f732-0549-4282-a494-c8a8ade9825d/kube-rbac-proxy/0.log" Apr 23 09:28:00.778672 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:00.778591 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n7qsk_ff66f732-0549-4282-a494-c8a8ade9825d/init-textfile/0.log" Apr 23 09:28:00.931700 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:00.931674 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2b8a74f2-1318-418e-9416-b9da2d995059/prometheus/0.log" Apr 23 09:28:00.957011 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:00.956982 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2b8a74f2-1318-418e-9416-b9da2d995059/config-reloader/0.log" Apr 23 09:28:00.985021 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:00.984994 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2b8a74f2-1318-418e-9416-b9da2d995059/thanos-sidecar/0.log" Apr 23 09:28:01.011595 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:01.011564 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2b8a74f2-1318-418e-9416-b9da2d995059/kube-rbac-proxy-web/0.log" Apr 23 09:28:01.042740 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:01.042649 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2b8a74f2-1318-418e-9416-b9da2d995059/kube-rbac-proxy/0.log" Apr 23 09:28:01.075399 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:01.075372 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2b8a74f2-1318-418e-9416-b9da2d995059/kube-rbac-proxy-thanos/0.log" Apr 23 09:28:01.106095 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:01.106057 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2b8a74f2-1318-418e-9416-b9da2d995059/init-config-reloader/0.log" Apr 23 09:28:01.150518 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:01.150476 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-dgflf_825d2a5f-bf4f-4230-a808-8ac94f92aa8b/prometheus-operator/0.log" Apr 23 09:28:01.175626 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:01.175588 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-dgflf_825d2a5f-bf4f-4230-a808-8ac94f92aa8b/kube-rbac-proxy/0.log" Apr 23 09:28:03.101120 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:03.101092 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/1.log" Apr 23 09:28:03.107780 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:03.107755 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7ftql_8e2935cf-4651-42ac-bd9e-54accc810a7a/console-operator/2.log" Apr 23 09:28:03.528745 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:03.528711 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-4f5nb_6da55603-9712-4719-b29b-d9feec7ef27f/download-server/0.log" Apr 23 09:28:04.300621 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.300579 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-fv82d"] Apr 23 09:28:04.301495 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.301467 2579 kubelet.go:2420] "Pod admission denied" podUID="a10b6daf-3a0a-4a05-afdc-dfdf6f4ac43e" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-fv82d" reason="Evicted" message="The node had condition: [DiskPressure]. " Apr 23 09:28:04.314467 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.314436 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-fv82d"] Apr 23 09:28:04.314648 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.314559 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-fv82d" Apr 23 09:28:04.334132 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.334101 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-fv82d"] Apr 23 09:28:04.337872 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.337844 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-fv82d"] Apr 23 09:28:04.339558 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.339497 2579 status_manager.go:919] "Failed to update status for pod" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-fv82d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a10b6daf-3a0a-4a05-afdc-dfdf6f4ac43e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-23T09:28:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-23T09:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-23T09:28:04Z\\\",\\\"reason\\\":\\\"PodFailed\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-23T09:28:04Z\\\",\\\"reason\\\":\\\"PodFailed\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-23T09:28:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8f2123db9953f98da0e43c266e6a8a070bf221533b995f87b7e358cc7498ca6d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-probe\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/sys\\\",\\\"name\\\":\\\"sys\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/proc\\\",\\\"name\\\":\\\"proc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/lib/modules\\\",\\\"name\\\":\\\"lib-modules\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/podresources\\\",\\\"name\\\":\\\"podres\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnqz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"10.0.131.47\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"10.0.131.47\\\"}]}}\" for pod \"openshift-must-gather-29qjf\"/\"perf-node-gather-daemonset-fv82d\": pods \"perf-node-gather-daemonset-fv82d\" not found" Apr 23 09:28:04.366932 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.366895 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-kk8wn"] Apr 23 09:28:04.367263 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.367242 2579 kubelet.go:2420] "Pod admission denied" podUID="86f4bdd5-8b81-4857-90f7-3ed51b9b5298" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-kk8wn" reason="Evicted" message="The node had condition: [DiskPressure]. " Apr 23 09:28:04.378162 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.378131 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-kk8wn"] Apr 23 09:28:04.378382 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.378250 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-kk8wn" Apr 23 09:28:04.391455 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.391417 2579 status_manager.go:895] "Failed to get status for pod" podUID="a10b6daf-3a0a-4a05-afdc-dfdf6f4ac43e" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-fv82d" err="pods \"perf-node-gather-daemonset-fv82d\" is forbidden: User \"system:node:ip-10-0-131-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-29qjf\": no relationship found between node 'ip-10-0-131-47.ec2.internal' and this object" Apr 23 09:28:04.409964 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.409911 2579 status_manager.go:895] "Failed to get status for pod" podUID="a10b6daf-3a0a-4a05-afdc-dfdf6f4ac43e" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-fv82d" err="pods \"perf-node-gather-daemonset-fv82d\" is forbidden: User \"system:node:ip-10-0-131-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-29qjf\": no relationship found between node 'ip-10-0-131-47.ec2.internal' and this object" Apr 23 09:28:04.471223 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.471183 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-kk8wn" Apr 23 09:28:04.471431 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.471183 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-fv82d" Apr 23 09:28:04.475531 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.475507 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-kk8wn" Apr 23 09:28:04.478607 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.478589 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-fv82d" Apr 23 09:28:04.784560 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.784529 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sr575_533a84b7-1e94-4312-8393-c6c787af53b6/dns/0.log" Apr 23 09:28:04.820674 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.820646 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sr575_533a84b7-1e94-4312-8393-c6c787af53b6/kube-rbac-proxy/0.log" Apr 23 09:28:04.908328 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:04.908300 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-c68x8_6d76e4ef-b56a-40fd-9c37-6eb55602fea4/dns-node-resolver/0.log" Apr 23 09:28:05.016190 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:05.016139 2579 status_manager.go:895] "Failed to get status for pod" podUID="a10b6daf-3a0a-4a05-afdc-dfdf6f4ac43e" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-fv82d" err="pods \"perf-node-gather-daemonset-fv82d\" is forbidden: User \"system:node:ip-10-0-131-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-29qjf\": no relationship found between node 'ip-10-0-131-47.ec2.internal' and this object" Apr 23 09:28:05.399430 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:05.399397 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-kk8wn"] Apr 23 09:28:05.406884 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:05.406854 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-kk8wn"] Apr 23 09:28:05.413054 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:05.413023 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6d4d878cc9-fkk7r_3dc2497b-5b53-4e93-ab87-8509553bef5d/registry/0.log" Apr 23 09:28:05.443777 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:05.443737 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-blz29"] Apr 23 09:28:05.444102 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:05.444080 2579 kubelet.go:2420] "Pod admission denied" podUID="60d4bd02-8909-46cc-9aed-87c7e3dd5be6" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-blz29" reason="Evicted" message="The node had condition: [DiskPressure]. " Apr 23 09:28:05.459082 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:05.459046 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-blz29"] Apr 23 09:28:05.459252 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:05.459181 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-blz29" Apr 23 09:28:05.474751 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:05.474688 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-kk8wn" Apr 23 09:28:05.474929 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:05.474857 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-fv82d" Apr 23 09:28:05.474999 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:05.474866 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-blz29" Apr 23 09:28:05.482688 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:05.482664 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-blz29" Apr 23 09:28:05.493613 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:05.493573 2579 status_manager.go:895] "Failed to get status for pod" podUID="86f4bdd5-8b81-4857-90f7-3ed51b9b5298" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-kk8wn" err="pods \"perf-node-gather-daemonset-kk8wn\" is forbidden: User \"system:node:ip-10-0-131-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-29qjf\": no relationship found between node 'ip-10-0-131-47.ec2.internal' and this object" Apr 23 09:28:05.495249 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:05.495218 2579 status_manager.go:895] "Failed to get status for pod" podUID="a10b6daf-3a0a-4a05-afdc-dfdf6f4ac43e" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-fv82d" err="pods \"perf-node-gather-daemonset-fv82d\" is forbidden: User \"system:node:ip-10-0-131-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-29qjf\": no relationship found between node 'ip-10-0-131-47.ec2.internal' and this object" Apr 23 09:28:05.547310 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:05.547280 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-z6xdx_a0aeb874-3a11-4639-a579-36512ba94069/node-ca/0.log" Apr 23 09:28:06.311110 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:06.311080 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-84bcc947b6-dxbqx_dcc4b0c4-24d9-42a8-826a-aeb82cce2a08/router/0.log" Apr 23 09:28:06.477381 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:06.477335 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-blz29" Apr 23 09:28:06.498897 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:06.498861 2579 status_manager.go:895] "Failed to get status for pod" podUID="86f4bdd5-8b81-4857-90f7-3ed51b9b5298" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-kk8wn" err="pods \"perf-node-gather-daemonset-kk8wn\" is forbidden: User \"system:node:ip-10-0-131-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-29qjf\": no relationship found between node 'ip-10-0-131-47.ec2.internal' and this object" Apr 23 09:28:06.500406 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:06.500377 2579 status_manager.go:895] "Failed to get status for pod" podUID="a10b6daf-3a0a-4a05-afdc-dfdf6f4ac43e" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-fv82d" err="pods \"perf-node-gather-daemonset-fv82d\" is forbidden: User \"system:node:ip-10-0-131-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-29qjf\": no relationship found between node 'ip-10-0-131-47.ec2.internal' and this object" Apr 23 09:28:06.747049 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:06.747012 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dd2cl_75c84a62-de6b-4301-9660-8d6ba1422a31/serve-healthcheck-canary/0.log" Apr 23 09:28:07.234092 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:07.234062 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gk2qp_6df0870f-1da9-4821-9995-098c84fe5be1/kube-rbac-proxy/0.log" Apr 23 09:28:07.258970 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:07.258944 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gk2qp_6df0870f-1da9-4821-9995-098c84fe5be1/exporter/0.log" Apr 23 09:28:07.282831 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:07.282802 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gk2qp_6df0870f-1da9-4821-9995-098c84fe5be1/extractor/0.log" Apr 23 09:28:07.489822 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:07.489735 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-blz29"] Apr 23 09:28:07.493003 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:07.492976 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-blz29"] Apr 23 09:28:07.521372 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:07.521324 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-vv8kc"] Apr 23 09:28:07.521582 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:07.521567 2579 kubelet.go:2420] "Pod admission denied" podUID="619e6bc3-9ca9-422f-82ff-acbf6c574ad4" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-vv8kc" reason="Evicted" message="The node had condition: [DiskPressure]. " Apr 23 09:28:07.531258 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:07.531229 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-vv8kc"] Apr 23 09:28:07.531595 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:07.531578 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-vv8kc" Apr 23 09:28:08.482962 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:08.482927 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-vv8kc" Apr 23 09:28:08.487307 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:08.487284 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-vv8kc" Apr 23 09:28:09.485926 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:09.485887 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-vv8kc" Apr 23 09:28:11.554414 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:11.554382 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-vv8kc"] Apr 23 09:28:11.564389 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:11.564316 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-vv8kc"] Apr 23 09:28:11.594191 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:11.594159 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-2tbws"] Apr 23 09:28:11.594438 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:11.594421 2579 kubelet.go:2420] "Pod admission denied" podUID="de68f71b-7802-4219-b87e-aa0b2ec271f0" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-2tbws" reason="Evicted" message="The node had condition: [DiskPressure]. " Apr 23 09:28:11.611251 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:11.611214 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-2tbws"] Apr 23 09:28:11.611445 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:11.611360 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-2tbws" Apr 23 09:28:12.446295 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:12.446266 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-cmdv2_a9685c08-6c7b-40e4-971e-fd8cbcf83069/migrator/0.log" Apr 23 09:28:12.479836 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:12.479802 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-cmdv2_a9685c08-6c7b-40e4-971e-fd8cbcf83069/graceful-termination/0.log" Apr 23 09:28:12.494999 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:12.494966 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-2tbws" Apr 23 09:28:12.499459 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:12.499438 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-2tbws" Apr 23 09:28:13.498598 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:13.498570 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-2tbws" Apr 23 09:28:14.013784 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:14.013759 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jwnqb_8e6d8633-b9de-4e37-96e7-465a09675f90/kube-multus-additional-cni-plugins/0.log" Apr 23 09:28:14.046519 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:14.046493 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jwnqb_8e6d8633-b9de-4e37-96e7-465a09675f90/egress-router-binary-copy/0.log" Apr 23 09:28:14.082587 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:14.082556 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jwnqb_8e6d8633-b9de-4e37-96e7-465a09675f90/cni-plugins/0.log" Apr 23 09:28:14.123957 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:14.123929 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jwnqb_8e6d8633-b9de-4e37-96e7-465a09675f90/bond-cni-plugin/0.log" Apr 23 09:28:14.179289 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:14.179269 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jwnqb_8e6d8633-b9de-4e37-96e7-465a09675f90/routeoverride-cni/0.log" Apr 23 09:28:14.297112 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:14.297025 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jwnqb_8e6d8633-b9de-4e37-96e7-465a09675f90/whereabouts-cni-bincopy/0.log" Apr 23 09:28:14.320260 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:14.320231 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jwnqb_8e6d8633-b9de-4e37-96e7-465a09675f90/whereabouts-cni/0.log" Apr 23 09:28:14.640119 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:14.640084 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h2vsr_4b89f18f-6e25-4059-988c-27d5c1a39867/kube-multus/0.log" Apr 23 09:28:14.737742 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:14.737712 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-96m6d_f3e78a7f-f7cc-48d6-a29a-7d418c195aef/network-metrics-daemon/0.log" Apr 23 09:28:14.769668 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:14.769643 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-96m6d_f3e78a7f-f7cc-48d6-a29a-7d418c195aef/kube-rbac-proxy/0.log" Apr 23 09:28:15.771640 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:15.771593 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-controller/0.log" Apr 23 09:28:15.799263 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:15.799233 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/0.log" Apr 23 09:28:15.809596 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:15.809573 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovn-acl-logging/1.log" Apr 23 09:28:15.836260 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:15.836231 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/kube-rbac-proxy-node/0.log" Apr 23 09:28:15.863774 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:15.863746 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 09:28:15.888536 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:15.888510 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/northd/0.log" Apr 23 09:28:15.918639 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:15.918611 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/nbdb/0.log" Apr 23 09:28:15.947285 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:15.947250 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/sbdb/0.log" Apr 23 09:28:16.057315 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:16.057240 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9bsnx_33cb4bff-3a7e-42a3-8d3d-e1c79d36437b/ovnkube-controller/0.log" Apr 23 09:28:18.072857 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:18.072819 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-4mjb6_bb54c9e2-2568-441d-a59a-ffa007afefd6/network-check-target-container/0.log" Apr 23 09:28:19.354512 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:19.354480 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-zf6q5_83905907-ce4f-4b05-a10c-1a78044f595e/iptables-alerter/0.log" Apr 23 09:28:19.640482 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:19.640451 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-2tbws"] Apr 23 09:28:19.647560 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:19.647530 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-2tbws"] Apr 23 09:28:19.679649 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:19.679619 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-27npt"] Apr 23 09:28:19.679933 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:19.679911 2579 kubelet.go:2420] "Pod admission denied" podUID="87376384-2bab-4547-8e8c-9289c7b0119d" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-27npt" reason="Evicted" message="The node had condition: [DiskPressure]. " Apr 23 09:28:19.697382 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:19.697331 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-27npt"] Apr 23 09:28:19.697705 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:19.697689 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-27npt" Apr 23 09:28:20.191400 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:20.191367 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-9jfcj_0991960b-d8b1-454f-a21e-8de493704ad2/tuned/0.log" Apr 23 09:28:20.518550 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:20.518474 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-27npt" Apr 23 09:28:20.522919 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:20.522894 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-27npt" Apr 23 09:28:21.521787 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:21.521757 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-27npt" Apr 23 09:28:22.165996 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:22.165966 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 09:28:22.250023 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:22.249948 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-st52h_b0e5f2a3-2fce-4078-99b7-c4be47e87be7/cluster-samples-operator/0.log" Apr 23 09:28:22.272377 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:22.272329 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-st52h_b0e5f2a3-2fce-4078-99b7-c4be47e87be7/cluster-samples-operator-watch/0.log" Apr 23 09:28:24.070430 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:24.070405 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-jx26p_8963f0bc-78e1-4f93-a9a9-bba51f04c437/csi-driver/0.log" Apr 23 09:28:24.094991 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:24.094966 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-jx26p_8963f0bc-78e1-4f93-a9a9-bba51f04c437/csi-node-driver-registrar/0.log" Apr 23 09:28:24.118639 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:24.118609 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-jx26p_8963f0bc-78e1-4f93-a9a9-bba51f04c437/csi-liveness-probe/0.log" Apr 23 09:28:35.730335 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:35.730301 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-27npt"] Apr 23 09:28:35.734609 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:35.734585 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-27npt"] Apr 23 09:28:35.762837 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:35.762806 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq"] Apr 23 09:28:35.771139 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:35.771122 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" Apr 23 09:28:35.777824 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:35.777798 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq"] Apr 23 09:28:35.949533 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:35.949494 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd84da3c-6425-48b5-8f73-7b04b3197a27-lib-modules\") pod \"perf-node-gather-daemonset-s6pdq\" (UID: \"dd84da3c-6425-48b5-8f73-7b04b3197a27\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" Apr 23 09:28:35.949533 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:35.949536 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dd84da3c-6425-48b5-8f73-7b04b3197a27-proc\") pod \"perf-node-gather-daemonset-s6pdq\" (UID: \"dd84da3c-6425-48b5-8f73-7b04b3197a27\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" Apr 23 09:28:35.949741 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:35.949554 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8msp7\" (UniqueName: \"kubernetes.io/projected/dd84da3c-6425-48b5-8f73-7b04b3197a27-kube-api-access-8msp7\") pod \"perf-node-gather-daemonset-s6pdq\" (UID: \"dd84da3c-6425-48b5-8f73-7b04b3197a27\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" Apr 23 09:28:35.949741 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:35.949573 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd84da3c-6425-48b5-8f73-7b04b3197a27-sys\") pod \"perf-node-gather-daemonset-s6pdq\" (UID: \"dd84da3c-6425-48b5-8f73-7b04b3197a27\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" Apr 23 09:28:35.949741 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:35.949703 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dd84da3c-6425-48b5-8f73-7b04b3197a27-podres\") pod \"perf-node-gather-daemonset-s6pdq\" (UID: \"dd84da3c-6425-48b5-8f73-7b04b3197a27\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" Apr 23 09:28:36.050103 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:36.050023 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dd84da3c-6425-48b5-8f73-7b04b3197a27-podres\") pod \"perf-node-gather-daemonset-s6pdq\" (UID: \"dd84da3c-6425-48b5-8f73-7b04b3197a27\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" Apr 23 09:28:36.050103 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:36.050070 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd84da3c-6425-48b5-8f73-7b04b3197a27-lib-modules\") pod \"perf-node-gather-daemonset-s6pdq\" (UID: \"dd84da3c-6425-48b5-8f73-7b04b3197a27\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" Apr 23 09:28:36.050103 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:36.050088 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/dd84da3c-6425-48b5-8f73-7b04b3197a27-proc\") pod \"perf-node-gather-daemonset-s6pdq\" (UID: \"dd84da3c-6425-48b5-8f73-7b04b3197a27\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" Apr 23 09:28:36.050103 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:36.050105 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8msp7\" (UniqueName: \"kubernetes.io/projected/dd84da3c-6425-48b5-8f73-7b04b3197a27-kube-api-access-8msp7\") pod \"perf-node-gather-daemonset-s6pdq\" (UID: \"dd84da3c-6425-48b5-8f73-7b04b3197a27\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" Apr 23 09:28:36.050405 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:36.050185 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dd84da3c-6425-48b5-8f73-7b04b3197a27-proc\") pod \"perf-node-gather-daemonset-s6pdq\" (UID: \"dd84da3c-6425-48b5-8f73-7b04b3197a27\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" Apr 23 09:28:36.050405 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:36.050196 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd84da3c-6425-48b5-8f73-7b04b3197a27-lib-modules\") pod \"perf-node-gather-daemonset-s6pdq\" (UID: \"dd84da3c-6425-48b5-8f73-7b04b3197a27\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" Apr 23 09:28:36.050405 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:36.050197 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dd84da3c-6425-48b5-8f73-7b04b3197a27-podres\") pod \"perf-node-gather-daemonset-s6pdq\" (UID: \"dd84da3c-6425-48b5-8f73-7b04b3197a27\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" Apr 23 09:28:36.050405 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:36.050233 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd84da3c-6425-48b5-8f73-7b04b3197a27-sys\") pod \"perf-node-gather-daemonset-s6pdq\" (UID: \"dd84da3c-6425-48b5-8f73-7b04b3197a27\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" Apr 23 09:28:36.050405 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:36.050336 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd84da3c-6425-48b5-8f73-7b04b3197a27-sys\") pod \"perf-node-gather-daemonset-s6pdq\" (UID: \"dd84da3c-6425-48b5-8f73-7b04b3197a27\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" Apr 23 09:28:36.058221 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:36.058203 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8msp7\" (UniqueName: \"kubernetes.io/projected/dd84da3c-6425-48b5-8f73-7b04b3197a27-kube-api-access-8msp7\") pod \"perf-node-gather-daemonset-s6pdq\" (UID: \"dd84da3c-6425-48b5-8f73-7b04b3197a27\") " pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" Apr 23 09:28:36.081315 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:36.081284 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" Apr 23 09:28:36.200623 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:36.200598 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq"] Apr 23 09:28:36.203168 ip-10-0-131-47 kubenswrapper[2579]: W0423 09:28:36.203137 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddd84da3c_6425_48b5_8f73_7b04b3197a27.slice/crio-de0aacbb2d7910e0b42d411a9fbd7706c70124c021c41c8fa03208186178fa23 WatchSource:0}: Error finding container de0aacbb2d7910e0b42d411a9fbd7706c70124c021c41c8fa03208186178fa23: Status 404 returned error can't find the container with id de0aacbb2d7910e0b42d411a9fbd7706c70124c021c41c8fa03208186178fa23 Apr 23 09:28:36.569930 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:36.569894 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" event={"ID":"dd84da3c-6425-48b5-8f73-7b04b3197a27","Type":"ContainerStarted","Data":"3f658579179c588fae2e8903e421088afe72ce747015c7303a839b37d0d19e97"} Apr 23 09:28:36.569930 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:36.569931 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" event={"ID":"dd84da3c-6425-48b5-8f73-7b04b3197a27","Type":"ContainerStarted","Data":"de0aacbb2d7910e0b42d411a9fbd7706c70124c021c41c8fa03208186178fa23"} Apr 23 09:28:36.570130 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:36.569953 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" Apr 23 09:28:36.589620 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:36.589514 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq" podStartSLOduration=1.58949783 podStartE2EDuration="1.58949783s" podCreationTimestamp="2026-04-23 09:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:28:36.589035754 +0000 UTC m=+2432.118515780" watchObservedRunningTime="2026-04-23 09:28:36.58949783 +0000 UTC m=+2432.118977834" Apr 23 09:28:42.583324 ip-10-0-131-47 kubenswrapper[2579]: I0423 09:28:42.583287 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-29qjf/perf-node-gather-daemonset-s6pdq"